XR-Cockpit: A comparison of VR and AR solutions on an interactive training station

Pellacini F.;
2020

Abstract

One of the most challenging aspects of implementing Virtual/Mixed Reality training systems is the effective simulation of the real-world manipulation of physical devices found in control interfaces, such as buttons, sliders, levers, and knobs. In this paper we describe a mockup airplane cockpit (XR-Cockpit) featuring interactive components of this kind, which demonstrates the feasibility of effectively simulating device manipulation using low-cost hand tracking technology and gesture recognition. Based on this system, we performed a user study to compare the effectiveness of interaction with virtual tools under different visualization solutions: immersive VR, and optical and video see-through MR. In our study, we also assessed how well real objects can be manipulated while wearing the two video see-through solutions. The analysis of the experimental results provides useful guidelines for the design of Virtual and Mixed Reality training systems involving both virtual and physical actions on manipulation devices.
Year: 2020
Conference: 25th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2020
Location: Vienna, Austria
Conference year: 2020
Volume: 2020-
Pages: 603-610
Authors: Caputo, A.; Jacota, S.; Krayevskyy, S.; Pesavento, M.; Pellacini, F.; Giachetti, A.
XR-Cockpit: A comparison of VR and AR solutions on an interactive training station / Caputo, A.; Jacota, S.; Krayevskyy, S.; Pesavento, M.; Pellacini, F.; Giachetti, A. - 2020-:(2020), pp. 603-610. (Paper presented at the 25th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2020, held in Vienna, Austria, in 2020) [10.1109/ETFA46521.2020.9212043].
Files in this record:
There are no files associated with this record.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1299571
Citations
  • PMC: ND
  • Scopus: 7
  • Web of Science (ISI): 2