
Auto-AzKNIOSH: an automatic NIOSH evaluation with Azure Kinect coupled with task recognition / Lolli, F.; Coruzzolo, A. M.; Forgione, C.; Peron, M.; Sgarbossa, F. - In: ERGONOMICS. - ISSN 0014-0139. - (2024), pp. 1-17. [10.1080/00140139.2024.2433027]

Auto-AzKNIOSH: an automatic NIOSH evaluation with Azure Kinect coupled with task recognition

Lolli, F.; Coruzzolo, A. M.; Forgione, C.; Peron, M.; Sgarbossa, F.
2024

Abstract

Standard Ergonomic Risk Assessment (ERA) from video analysis is a highly time-consuming activity and is affected by the subjectivity of ergonomists. Motion Capture (MOCAP) addresses these limitations by allowing objective ERA. Here, a depth camera (the Azure Kinect), one of the most commonly used MOCAP systems for ERA, is used to evaluate the NIOSH Lifting Equation through a tool named AzKNIOSH. First, to validate the tool, we compared its performance with that of a commercial software package, Siemens Jack TAT, based on an Inertial Measurement Unit (IMU) suit, and found high agreement between the two. Secondly, a Convolutional Neural Network (CNN) was employed for task recognition, automatically identifying the lifting actions. This procedure was evaluated by comparing the results obtained from manual detection with those obtained through automatic detection. Thus, through automated task detection and the implementation of Auto-AzKNIOSH, we achieved a fully automated ERA.

Practitioner Summary: The standard evaluation of the NIOSH Lifting Equation is time-consuming and subjective, so we designed a new automatic tool that integrates motion capture data provided by the Azure Kinect with task recognition. We found high agreement between our tool and the Siemens Jack TAT suit, the gold-standard technology for motion capture.
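For context, the revised NIOSH Lifting Equation combines a load constant with several multipliers derived from the measured lifting geometry, and the Lifting Index compares the actual load with the resulting Recommended Weight Limit. The sketch below is illustrative only and is not the paper's AzKNIOSH implementation: it assumes the geometric inputs (horizontal distance, vertical height, travel distance, asymmetry angle) have already been extracted, e.g. from Azure Kinect joint tracking, and it takes the frequency and coupling multipliers as given, since in the standard these come from lookup tables.

```python
# Minimal sketch of the revised NIOSH Lifting Equation (metric version).
# Illustrative only: in the tool described in the abstract these inputs
# would be derived from Azure Kinect skeleton data, not typed by hand.

def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm, cm, lc_kg=23.0):
    """Return the Recommended Weight Limit (RWL) in kg.

    h_cm : horizontal distance of the hands from the ankle midpoint (cm)
    v_cm : vertical height of the hands at the origin of the lift (cm)
    d_cm : vertical travel distance of the lift (cm)
    a_deg: asymmetry angle (degrees)
    fm, cm: frequency and coupling multipliers (from the NIOSH tables)
    """
    hm = min(1.0, 25.0 / max(h_cm, 25.0))          # horizontal multiplier
    vm = max(0.0, 1.0 - 0.003 * abs(v_cm - 75.0))  # vertical multiplier
    dm = min(1.0, 0.82 + 4.5 / max(d_cm, 25.0))    # distance multiplier
    am = max(0.0, 1.0 - 0.0032 * a_deg)            # asymmetry multiplier
    return lc_kg * hm * vm * dm * am * fm * cm


def lifting_index(load_kg, rwl_kg):
    """Lifting Index = actual load / RWL; values above 1 flag elevated risk."""
    return load_kg / rwl_kg


if __name__ == "__main__":
    rwl = recommended_weight_limit(h_cm=40, v_cm=60, d_cm=50, a_deg=30,
                                   fm=0.94, cm=1.0)
    print(f"RWL = {rwl:.1f} kg, LI = {lifting_index(10.0, rwl):.2f}")
```

In the fully automated pipeline described in the abstract, the task-recognition CNN would additionally mark when each lifting action starts and ends, so that these parameters are evaluated only over the detected lifts.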
Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1365565