
Supporting Autonomous Navigation of Visually Impaired People for Experiencing Cultural Heritage / Croce, Daniele; Galioto, Giovanni; Galioto, Natale; Garlisi, Domenico; Giarrè, Laura; Inderst, Federica; Pascucci, Federica; Tinnirello, Ilenia. - (2020), pp. 25-46. [10.1007/978-3-030-36107-5_2]

Supporting Autonomous Navigation of Visually Impaired People for Experiencing Cultural Heritage

Laura Giarrè
2020

Abstract

In this chapter, we present a system for indoor and outdoor localization and navigation that allows low-vision users to experience cultural heritage autonomously. The system is based on the joint use of dead-reckoning and computer vision techniques in a smartphone-centric tracking system. It is explicitly designed for visually impaired people, but it can easily be generalized to other users, and it is built under the assumption that special reference signals, such as colored tapes, painted lines, or tactile paving, are deployed in the environment to guide visually impaired users along pre-defined paths. Unlike previous work on localization, which focuses only on the inertial sensors integrated into smartphones, we exploit the smartphone camera as an additional sensor that, on one side, can help the visually impaired user identify the paths and, on the other side, can provide direction estimates to the tracking system. With the help of the navigation system, users can experience a museum or cultural site autonomously, by following a pre-defined path and by moving from one location of interest to another, without any external personal assistant. We demonstrate the effectiveness of our approach by means of experimental tests performed in a controlled indoor environment and in a real outdoor installation.
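The abstract only outlines the approach; as a purely illustrative aid, the sketch below shows one plausible way the two ingredients could be combined on a smartphone: an OpenCV-based estimate of the guiding tape's direction in the camera frame, and a complementary filter that blends it with the dead-reckoning heading. Function names, the HSV thresholds, and the filter weight alpha are assumptions made for the sketch, not the authors' actual implementation.

```python
import cv2
import numpy as np

# Hypothetical HSV range for a yellow guiding tape; it would need on-site calibration.
TAPE_HSV_LOW = np.array([20, 80, 80])
TAPE_HSV_HIGH = np.array([35, 255, 255])

def estimate_tape_offset(frame_bgr):
    """Return the angle (radians) between the camera's forward axis and the
    guiding tape visible in the frame, or None if the tape is not detected."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, TAPE_HSV_LOW, TAPE_HSV_HIGH)
    pts = cv2.findNonZero(mask)
    if pts is None or len(pts) < 50:      # too few tape pixels for a reliable fit
        return None
    vx, vy, _, _ = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    if vy > 0:                            # orient the fitted line "away" from the user
        vx, vy = -vx, -vy
    return float(np.arctan2(vx, -vy))     # 0 rad = tape aligned with the camera axis

def fuse_heading(dr_heading, cam_heading, alpha=0.9):
    """Complementary filter: blend the dead-reckoning heading (inertial sensors)
    with a camera-derived heading, working on the unit circle to handle wrap-around."""
    if cam_heading is None:
        return dr_heading
    x = alpha * np.cos(dr_heading) + (1 - alpha) * np.cos(cam_heading)
    y = alpha * np.sin(dr_heading) + (1 - alpha) * np.sin(cam_heading)
    return float(np.arctan2(y, x))
```

In a deployment along the lines described in the chapter, the camera offset could first be converted into an absolute heading using the known map orientation of the current path segment, and only then fused with the dead-reckoning estimate.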
2020
Rediscovering Heritage Through Technology
978-3-030-36106-8
Springer
Croce, Daniele; Galioto, Giovanni; Galioto, Natale; Garlisi, Domenico; Giarrè, Laura; Inderst, Federica; Pascucci, Federica; Tinnirello, Ilenia


Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1200594
Citations
  • Scopus: 4