Enabling independent navigation for visually impaired people through a wearable vision-based feedback system / Wang, Hsueh-Cheng; Katzschmann, Robert K.; Teng, Santani; Araki, Brandon; Giarrè, Laura; Rus, Daniela. - (2017), pp. 6533-6540. (Paper presented at the 2017 IEEE International Conference on Robotics and Automation (ICRA), held in Singapore, 29 May-3 June 2017) [10.1109/ICRA.2017.7989772].

Title: Enabling independent navigation for visually impaired people through a wearable vision-based feedback system
Author: Laura Giarré
Year: 2017

Abstract

This work introduces a wearable system to provide situational awareness for blind and visually impaired people. The system includes a camera, an embedded computer and a haptic device to provide feedback when an obstacle is detected. The system uses techniques from computer vision and motion planning to (1) identify walkable space; (2) plan step-by-step a safe motion trajectory in the space, and (3) recognize and locate certain types of objects, for example the location of an empty chair. These descriptions are communicated to the person wearing the device through vibrations. We present results from user studies with low- and high-level tasks, including walking through a maze without collisions, locating a chair, and walking through a crowded environment while avoiding people.
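
To illustrate the kind of obstacle-to-vibration mapping the abstract describes, the following is a minimal, self-contained Python sketch. It is not the authors' implementation: the sector split, range thresholds, and function names (sector_distances, vibration_intensities) are assumptions chosen for illustration. The sketch takes a depth frame, finds the nearest obstacle in the left, center, and right thirds of the image, and converts each distance into a vibration intensity for a corresponding haptic motor.

```python
import numpy as np

def sector_distances(depth_m, num_sectors=3, max_range_m=4.0):
    """Split a depth image (meters) into vertical sectors, left to right,
    and return the nearest valid obstacle distance in each sector."""
    h, w = depth_m.shape
    distances = []
    for s in range(num_sectors):
        col_lo = s * w // num_sectors
        col_hi = (s + 1) * w // num_sectors
        sector = depth_m[:, col_lo:col_hi]
        # Keep readings inside a plausible range; drop sensor noise and far returns.
        valid = sector[(sector > 0.2) & (sector < max_range_m)]
        distances.append(float(valid.min()) if valid.size else max_range_m)
    return distances

def vibration_intensities(distances, max_range_m=4.0):
    """Map each sector's obstacle distance to a vibration intensity in [0, 1]:
    closer obstacles produce stronger vibration on the corresponding motor."""
    return [float(np.clip(1.0 - d / max_range_m, 0.0, 1.0)) for d in distances]

if __name__ == "__main__":
    # Fake depth frame: an obstacle about 0.8 m away in the right third of the image.
    depth = np.full((240, 320), 3.5, dtype=np.float32)
    depth[:, 220:] = 0.8
    dists = sector_distances(depth)
    print("sector distances (m):", dists)
    print("vibration intensities:", vibration_intensities(dists))
```

In a real system the depth frame would come from the wearable camera and the intensities would drive the haptic device in a feedback loop; the three-sector split above simply stands in for whichever spatial encoding the device uses.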
Year: 2017
Conference: 2017 IEEE International Conference on Robotics and Automation (ICRA)
Location: Singapore
Dates: 29 May-3 June 2017
Pages: 6533-6540
Authors: Wang, Hsueh-Cheng; Katzschmann, Robert K.; Teng, Santani; Araki, Brandon; Giarrè, Laura; Rus, Daniela
Files in this item:
File: mit.pdf (restricted access)
Type: Publisher's published version
Size: 1.24 MB
Format: Adobe PDF

Creative Commons License
The metadata in IRIS UNIMORE are released under a Creative Commons CC0 1.0 Universal license, while the publication files are released under an Attribution 4.0 International (CC BY 4.0) license, unless otherwise indicated.
In case of copyright infringement, please contact Supporto Iris.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1150940
Citations
  • PMC: not available
  • Scopus: 123
  • Web of Science (ISI): not available