
Sensor Fusion Localization and Navigation for Visually Impaired People / Galioto, Giovanni; Tinnirello, Ilenia; Croce, Daniele; Pascucci, Federica; Inderst, Federica; Giarre', Laura. - (2018). (Paper presented at the European Control Conference 2018, held in Limassol, Cyprus, in 2018).

Sensor Fusion Localization and Navigation for Visually Impaired People

Giarre', Laura
2018

Abstract

In this paper, we present an innovative cyber-physical system for indoor and outdoor localization and navigation, based on the joint use of dead-reckoning and computer vision techniques in a smartphone-centric tracking system. The system is explicitly designed for visually impaired people, but it can be easily generalized to other users. It is built under the assumption that special reference signals, such as colored tapes, painted lines, or tactile paving, are deployed in the environment to guide visually impaired users along pre-defined paths. Unlike previous works on localization, which rely only on the inertial sensors integrated into smartphones, we exploit the smartphone camera as an additional sensor that, on the one hand, helps the visually impaired user identify the paths and, on the other hand, provides direction estimates to the tracking system. We demonstrate the effectiveness of our approach by means of experimental tests performed in a real outdoor installation and in a controlled indoor environment.
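The abstract does not detail the fusion algorithm itself. As a purely illustrative sketch (not taken from the paper), the Python fragment below shows one common way to combine a gyroscope-integrated heading with a camera-derived path direction through a complementary filter, followed by a standard pedestrian dead-reckoning position update. The function names, the blending weight alpha, and the fixed step length are hypothetical choices for illustration only.

import math

def fuse_heading(imu_heading, camera_heading, alpha=0.7):
    """Blend the inertial (gyroscope-integrated) heading with the
    camera-derived path direction using a complementary filter.
    Angles are in radians; alpha weights the camera estimate."""
    # Wrap the angular difference to (-pi, pi] so the blend behaves
    # correctly across the 0 / 2*pi boundary.
    diff = math.atan2(math.sin(camera_heading - imu_heading),
                      math.cos(camera_heading - imu_heading))
    return imu_heading + alpha * diff

def dead_reckoning_step(x, y, heading, step_length=0.7):
    """Advance the position estimate by one detected step along the
    fused heading (standard pedestrian dead-reckoning update)."""
    return (x + step_length * math.cos(heading),
            y + step_length * math.sin(heading))

# Hypothetical usage: one step, IMU heading 10 deg, camera line at 5 deg.
h = fuse_heading(math.radians(10), math.radians(5))
x, y = dead_reckoning_step(0.0, 0.0, h)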
Year: 2018
Conference: European Control Conference 2018
Location: Limassol, Cyprus
Authors: Galioto, Giovanni; Tinnirello, Ilenia; Croce, Daniele; Pascucci, Federica; Inderst, Federica; Giarre', Laura

Use this identifier to cite or link to this record: https://hdl.handle.net/11380/1167372
Citations
  • Scopus: 7
  • Web of Science: 7