Bandini, A.; Namasivayam, A.; Yunusova, Y. Video-based tracking of jaw movements during speech: Preliminary results and future directions. In: Proceedings of the 18th Annual Conference of the International Speech Communication Association (INTERSPEECH 2017), Sweden, 2017, pp. 689-693. DOI: 10.21437/Interspeech.2017-1371.

Video-based tracking of jaw movements during speech: Preliminary results and future directions

Bandini, A.; Namasivayam, A.; Yunusova, Y.
2017

Abstract

Facial movements (e.g., of the lips and jaw) can provide important information for the assessment, diagnosis, and treatment of motor speech disorders. However, due to the high cost of the instrumentation used to record speech movements, such information is typically limited to research studies. With the recent development of depth sensors and efficient algorithms for facial tracking, clinical applications of this technology may become possible. Although lip-tracking methods have been validated in the past, jaw tracking remains a challenge. In this study, we assessed the accuracy of tracking jaw movements with a video-based system composed of a face tracker and a depth sensor specifically developed for short-range applications (Intel® RealSense™ SR300). The assessment was performed on healthy subjects during speech and non-speech tasks. Preliminary results showed that jaw movements can be tracked with reasonable accuracy (RMSE ≈ 2 mm), with better performance for slow movements. Further tests are needed to improve the performance of these systems and to develop accurate methodologies that can reveal subtle changes in jaw movements for the assessment and treatment of motor speech disorders.
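For context, the RMSE figure quoted in the abstract is the standard root-mean-square error between the video-derived jaw trajectory and a reference measurement. The sketch below shows that computation only; the synthetic signals, the 100 Hz sampling rate, and the use of an electromagnetic articulograph as the reference are illustrative assumptions, not details taken from the paper.

import numpy as np

def rmse(tracked: np.ndarray, reference: np.ndarray) -> float:
    """Root-mean-square error between two jaw-position signals of equal length (in mm)."""
    tracked = np.asarray(tracked, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((tracked - reference) ** 2)))

# Illustrative use: compare a depth-camera jaw trajectory against a reference
# trajectory (assumed here to come from an electromagnetic articulograph),
# both resampled to a common rate and expressed in millimetres.
if __name__ == "__main__":
    t = np.arange(0.0, 2.0, 0.01)                    # 2 s of data at 100 Hz (assumed rate)
    reference_mm = 5.0 * np.sin(2 * np.pi * 2 * t)   # synthetic jaw-opening signal (mm)
    tracked_mm = reference_mm + np.random.normal(0.0, 1.5, t.shape)  # noisy video estimate
    print(f"RMSE = {rmse(tracked_mm, reference_mm):.2f} mm")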
Year: 2017
Conference: 18th Annual Conference of the International Speech Communication Association, INTERSPEECH 2017
Location: Sweden
Volume: 2017-
Pages: 689-693
Authors: Bandini, A.; Namasivayam, A.; Yunusova, Y.
Files in this record:
File: 2017_Bandini_INTERSPEECH_JAW.PDF
Access: restricted
Type: VOR - Version published by the publisher
Size: 356.03 kB
Format: Adobe PDF
Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1401673
Citations
  • PMC: not available
  • Scopus: 17
  • Web of Science (ISI): not available