
Human action recognition dataset for ergonomic risk assessment applications / Coruzzolo, A. M.; Forgione, C.; Lolli, F.; Zhao, Q.; Rimini, B. (2023). (Paper presented at the XXVIII AIDI Summer School "Francesco Turco", held in Genova, 6-8 September 2023).

Human action recognition dataset for ergonomic risk assessment applications

Coruzzolo A. M.; Forgione C.; Lolli F.; Zhao Q.; Rimini B.
2023

Abstract

Marker-less sensors are widely used for body tracking in various applications, such as Human Action Recognition (HAR) and ergonomic risk assessment. For HAR, several datasets have been developed to train neural networks (NNs) to recognize actions from labeled skeleton videos. However, state-of-the-art datasets cover many tasks but rarely include work activities such as load lifting or manual handling. This paper presents an expansion of the Northwestern-UCLA dataset with manual handling and load-lifting activities. Our contribution adds 540 acquisitions to the existing dataset, in which 3 different tasks are performed by 5 different subjects. The first task is not new but helps generalize task recognition: picking up a box from different heights and surfaces. These acquisitions were assigned to an existing Northwestern-UCLA label, while two completely new labels were added: leaving a box on a table and leaving a box inside a container. The final dataset thus contains 12 labels and 2034 acquisitions and was used to train and test a graph convolutional neural network. Test results show high accuracy in recognizing the newly added actions. The work presented here aims to support automatic ergonomic evaluation in picking activities.
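The record does not specify the architecture of the graph convolutional network used on the skeleton data. As a hedged illustration only, the core operation such networks apply to a skeleton graph (joints as nodes, bones as edges) can be sketched with a single symmetrically normalized graph-convolution layer in NumPy. The joint count, bone list, and feature sizes below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def normalized_adjacency(edges, num_joints):
    """Build A_hat = D^{-1/2} (A + I) D^{-1/2} for a skeleton graph."""
    A = np.eye(num_joints)                    # self-loops keep each joint's own features
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0               # undirected bone connections
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

def graph_conv(X, A_hat, W):
    """One GCN layer: aggregate neighboring joint features, project, apply ReLU."""
    return np.maximum(A_hat @ X @ W, 0.0)

# Toy 5-joint skeleton (hypothetical): hip-spine-head plus two hands off the spine.
edges = [(0, 1), (1, 2), (1, 3), (1, 4)]
A_hat = normalized_adjacency(edges, num_joints=5)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # per-joint 3-D coordinates for one frame
W = rng.standard_normal((3, 8))   # learned projection (random here for illustration)
H = graph_conv(X, A_hat, W)
print(H.shape)                     # (5, 8): 8 features per joint
```

In practice, skeleton-action classifiers stack several such layers over the time dimension as well and end with a pooling-plus-softmax head over the action labels; the single spatial layer above only shows the graph aggregation step.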
2023
XXVIII AIDI Summer School - Francesco Turco
Genova
6-8 September 2023
Coruzzolo, A. M.; Forgione, C.; Lolli, F.; Zhao, Q.; Rimini, B.
Files in this item:

File: SummerSchool_23.pdf
Access: Open access
Type: Publisher's published version
Size: 843.46 kB
Format: Adobe PDF
Creative Commons License
Metadata in IRIS UNIMORE are released under a Creative Commons CC0 1.0 Universal license, while publication files are released under an Attribution 4.0 International (CC BY 4.0) license, unless otherwise indicated.
In case of copyright infringement, contact Iris Support.

Use this identifier to cite or link to this item: https://hdl.handle.net/11380/1363773