
Veridical Perception of 3D Objects in a Dynamic Stereoscopic Augmented Reality System / Chessa, M.; Garibotti, M.; Canessa, A.; Gibaldi, A.; Sabatini, S. P.; Solari, F. - 359:(2013), pp. 274-285. (Paper presented at the 7th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISIGRAPP 2012, held in Rome, Italy, in 2012) [10.1007/978-3-642-38241-3_18].

Veridical Perception of 3D Objects in a Dynamic Stereoscopic Augmented Reality System

Canessa, A.; Gibaldi, A.
2013

Abstract

Augmented reality environments, where humans can interact with both virtual and real objects, are a powerful tool for achieving natural human-computer interaction. The recent diffusion of off-the-shelf stereoscopic displays and motion capture devices has paved the way for the development of effective augmented reality systems at affordable costs. However, with conventional approaches, a user freely moving in front of a 3D display can experience a misperception of the 3D position and shape of virtual objects. Such distortions can have serious consequences in scientific and medical fields, where veridical perception is required, and they can cause visual fatigue in consumer and entertainment applications. In this paper, we develop an augmented reality system, based on a novel stereoscopic rendering technique, capable of correctly rendering 3D virtual objects to a user who changes his/her position in the real world and acts in the virtual scenario. The proposed rendering technique has been tested by several observers in both a static and a dynamic augmented reality scenario. The obtained results confirm the improvement of the developed solution over standard systems. © Springer-Verlag Berlin Heidelberg 2013.
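The full paper is not attached to this record, so the authors' actual rendering technique is not reproduced here. As a generic illustration of the class of problem the abstract describes, the standard remedy for viewpoint-dependent stereo distortion is a head-coupled off-axis (asymmetric-frustum) projection: the viewing frustum for each eye is recomputed from the tracked eye position relative to the physical screen. The sketch below assumes a screen centered at the origin in the z = 0 plane and an eye at positive z; all names and parameters are illustrative assumptions, not the paper's implementation.

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Frustum bounds (left, right, bottom, top) at the near plane
    for a tracked eye position, glFrustum-style.

    eye      -- (x, y, z) eye position in screen-centered coordinates,
                with the screen in the plane z = 0 and z > 0 toward the viewer
    screen_w -- physical screen width (same units as eye)
    screen_h -- physical screen height
    near     -- near-plane distance
    """
    ex, ey, ez = eye
    # Similar triangles: project the screen edges onto the near plane.
    scale = near / ez
    left = (-screen_w / 2.0 - ex) * scale
    right = (screen_w / 2.0 - ex) * scale
    bottom = (-screen_h / 2.0 - ey) * scale
    top = (screen_h / 2.0 - ey) * scale
    return left, right, bottom, top


def stereo_frusta(head, ipd, screen_w, screen_h, near):
    """One asymmetric frustum per eye, offsetting the tracked head
    position by half the interpupillary distance along x."""
    hx, hy, hz = head
    left_eye = (hx - ipd / 2.0, hy, hz)
    right_eye = (hx + ipd / 2.0, hy, hz)
    return (off_axis_frustum(left_eye, screen_w, screen_h, near),
            off_axis_frustum(right_eye, screen_w, screen_h, near))
```

When the eye is centered the frustum is symmetric; as the viewer moves laterally, the frustum skews so that the rendered geometry stays registered with the physical screen, which is what removes the position- and shape-misperception a fixed-viewpoint stereo setup produces.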
Year: 2013
Conference: 7th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISIGRAPP 2012
Location: Rome, Italy
Conference year: 2012
Volume: 359
Pages: 274-285
Authors: Chessa, M.; Garibotti, M.; Canessa, A.; Gibaldi, A.; Sabatini, S. P.; Solari, F.
Files in this record:
No files are associated with this record.

Creative Commons License
Metadata in IRIS UNIMORE are released under the Creative Commons CC0 1.0 Universal license, while publication files are released under the Attribution 4.0 International (CC BY 4.0) license, unless otherwise indicated.
In case of copyright violation, contact Iris Support.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1362488
Citations
  • PMC: n/a
  • Scopus: 3
  • Web of Science: n/a