Demo: Sensor Fusion Localization and Navigation for Visually Impaired People

Ilenia Tinnirello; Laura Giarré
2017

Abstract

We present an innovative smartphone-centric tracking system for indoor and outdoor environments, based on the joint use of dead-reckoning and computer vision (CV) techniques. The system is explicitly designed for visually impaired people (although it could easily be generalized to other users) and is built under the assumption that special reference signals, such as painted lines, colored tapes or tactile paving, are deployed in the environment to guide visually impaired users along pre-defined paths. Thanks to highly optimized software, we are able to execute the CV and sensor-fusion algorithms in real time on low-power hardware such as a standard smartphone, precisely tracking the user's movements.
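The record carries only this abstract, so the actual fusion algorithm is not described here. Purely as an illustrative assumption (not the authors' implementation), the sketch below shows one common way a dead-reckoning position update could be corrected by a computer-vision observation of the reference line: the inertial heading is nudged toward the orientation of the detected line with a simple complementary filter. The function names, step length, drift model and blending weight alpha are all hypothetical.

import math

def dead_reckoning_step(x, y, heading, step_length):
    # Propagate the position by one detected step along the current heading.
    return x + step_length * math.cos(heading), y + step_length * math.sin(heading)

def fuse_heading(inertial_heading, cv_line_heading, alpha=0.9):
    # Blend the inertial (gyro-derived) heading with the orientation of the
    # reference line seen by the camera; alpha close to 1 trusts the gyro more.
    innovation = math.atan2(math.sin(cv_line_heading - inertial_heading),
                            math.cos(cv_line_heading - inertial_heading))
    return inertial_heading + (1.0 - alpha) * innovation

# Toy usage: five steps along a tape laid out at 10 degrees, with a slowly
# drifting gyro heading that the CV observation keeps pulling back on track.
x, y, heading = 0.0, 0.0, 0.0
line_heading = math.radians(10.0)
for k in range(5):
    heading += math.radians(1.0)                    # simulated gyro drift
    heading = fuse_heading(heading, line_heading)   # CV-based correction
    x, y = dead_reckoning_step(x, y, heading, step_length=0.7)
    print("step %d: x=%.2f m, y=%.2f m, heading=%.1f deg"
          % (k + 1, x, y, math.degrees(heading)))

A Kalman-style filter would play the same role; the complementary filter is used here only to keep the example short, and in the real system the line orientation would come from detecting the painted line or tactile paving in the camera frames rather than being given as a constant.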
Year: 2017
Conference: 23rd Annual International Conference on Mobile Computing and Networking, MobiCom 2017
Location: Snowbird, Utah, USA
Dates: October 16-20, 2017
Pages: 471-473
Authors: Galioto, Giovanni; Tinnirello, Ilenia; Croce, Daniele; Inderst, Federica; Pascucci, Federica; Giarrè, Laura
Demo: Sensor Fusion Localization and Navigation for Visually Impaired People / Galioto, Giovanni; Tinnirello, Ilenia; Croce, Daniele; Inderst, Federica; Pascucci, Federica; Giarrè, Laura. - (2017), pp. 471-473. (Paper presented at the 23rd Annual International Conference on Mobile Computing and Networking, MobiCom 2017, held in Snowbird, Utah, USA, October 16-20, 2017) [10.1145/3117811.3119858].
Files in this record:
There are no files associated with this record.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1150960
Citations
  • PMC: n/a
  • Scopus: 7
  • Web of Science: 5