Recent studies have demonstrated that lip and jaw movements during speech may provide important information for the diagnosis of amyotrophic lateral sclerosis (ALS) and for understanding its progression. A thorough investigation of these movements is essential for the development of intelligent video- or optically based facial-tracking systems that could assist with early diagnosis and progress monitoring. In this paper, we investigated the potential of a novel and expanded set of kinematic features obtained from the lips and jaw to classify articulatory data into three stages of bulbar disease progression (i.e., pre-symptomatic, early symptomatic, and late symptomatic). Feature selection methods (Relief-F and mRMR) and a classification algorithm (SVM) were used for this purpose. Results showed that even with a limited number of kinematic features it was possible to obtain good classification accuracy (nearly 80%). Given the recent development of video-based markerless methods for tracking speech movements, these results provide a strong rationale for the development of portable and inexpensive systems for monitoring orofacial function in ALS.
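The pipeline described in the abstract (feature selection followed by SVM classification into three bulbar stages) can be sketched as below. This is a minimal illustration, not the paper's implementation: Relief-F and mRMR are not part of scikit-learn, so mutual-information ranking stands in as a placeholder selector, and the data are synthetic stand-ins for the kinematic features.

```python
# Minimal sketch of a "feature selection + SVM" pipeline like the one in the
# abstract. NOTE: the paper used Relief-F and mRMR; mutual_info_classif is a
# substitute here because neither selector ships with scikit-learn.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for lip/jaw kinematic features:
# 60 recordings x 30 features; 3 stages (0=pre, 1=early, 2=late symptomatic).
X = rng.normal(size=(60, 30))
y = np.repeat([0, 1, 2], 20)
X[:, 0] += y  # make a couple of features informative
X[:, 1] -= y

clf = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(mutual_info_classif, k=5)),  # small feature subset
    ("svm", SVC(kernel="rbf", C=1.0)),
])
scores = cross_val_score(clf, X, y, cv=5)  # stratified 5-fold CV
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Keeping the selector inside the `Pipeline` ensures feature ranking is refit on each training fold, avoiding the selection bias that would arise from choosing features on the full dataset before cross-validation.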

Classification of bulbar ALS from kinematic features of the jaw and lips: Towards computer-mediated assessment / Bandini, A.; Green, J. R.; Zinman, L.; Yunusova, Y. - 2017-:(2017), pp. 1819-1823. (18th Annual Conference of the International Speech Communication Association, INTERSPEECH 2017, Stockholm, Sweden, Aug 20-24, 2017) [10.21437/Interspeech.2017-478].

Classification of bulbar ALS from kinematic features of the jaw and lips: Towards computer-mediated assessment

Bandini, A.
2017

Year: 2017
Conference: 18th Annual Conference of the International Speech Communication Association, INTERSPEECH 2017
Location: Stockholm, Sweden
Dates: August 20-24, 2017
Volume: 2017-
Pages: 1819-1823
Authors: Bandini, A.; Green, J. R.; Zinman, L.; Yunusova, Y.
Files in this record:
  File: 2017_Bandini_INTERSPEECH_ALS.PDF
  Type: Doctoral thesis
  Licence: [IR] closed
  Size: 181.37 kB
  Format: Adobe PDF
  Access: Restricted

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1401695
Citations:
  • PubMed Central: n/a
  • Scopus: 14
  • Web of Science: 13