ContrasGAN: Unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning

Franco Zambonelli
2021

Abstract

Human Activity Recognition (HAR) makes it possible to drive applications directly from embedded and wearable sensors. Machine learning, and especially deep learning, has made significant progress in learning sensor features from raw sensing signals with high recognition accuracy. However, most techniques need to be trained on a large labelled dataset, which is often difficult to acquire. In this paper, we present ContrasGAN, an unsupervised domain adaptation technique that addresses this labelling challenge by transferring an activity model from one labelled domain to other unlabelled domains. ContrasGAN uses bi-directional generative adversarial networks for heterogeneous feature transfer and contrastive learning to capture distinctive features between classes. We evaluate ContrasGAN on three commonly-used HAR datasets under conditions of cross-body, cross-user, and cross-sensor transfer learning. Experimental results show a superior performance of ContrasGAN on all these tasks over a number of state-of-the-art techniques, with relatively low computational cost.
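As a rough illustration of the contrastive component mentioned in the abstract (the paper specifies its exact loss and architecture; they are not reproduced here), the sketch below shows a generic supervised contrastive loss in NumPy: embeddings of sensor windows sharing an activity label are pulled together, while embeddings of different activities are pushed apart. The function name, temperature value, and toy data are illustrative assumptions, not the authors' code.

import numpy as np

def contrastive_loss(embeddings, labels, temperature=0.1):
    """Generic supervised contrastive loss sketch (not ContrasGAN's exact loss).

    embeddings : (N, D) array of feature vectors, one per sensor window.
    labels     : (N,)  array of activity labels.
    Samples with the same label act as positives for each other;
    all remaining samples act as negatives.
    """
    # L2-normalise so the dot product becomes a cosine similarity.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature  # pairwise scaled similarities
    n = len(labels)

    loss, count = 0.0, 0
    for i in range(n):
        # Positives: other samples with the same activity label.
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        # Log-softmax denominator over all other samples (anchor excluded).
        others = [j for j in range(n) if j != i]
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        loss += -np.mean([sim[i, j] - log_denom for j in pos])
        count += 1
    return loss / max(count, 1)

# Toy usage: 4 windows, 2 activities, 8-dimensional embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
lab = np.array([0, 0, 1, 1])
print(contrastive_loss(emb, lab))

Minimising a loss of this kind encourages class-discriminative embeddings, which is the role the abstract attributes to contrastive learning within ContrasGAN; the adversarial (bi-directional GAN) component handles the cross-domain feature transfer itself.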
Year: 2021
Volume: 78
Pages: 1-30
ContrasGAN: Unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning / Rosales Sanabria, Andrea; Zambonelli, Franco; Dobson, Simon; Ye, Juan. - In: PERVASIVE AND MOBILE COMPUTING. - ISSN 1574-1192. - 78:(2021), pp. 1-30. [10.1016/j.pmcj.2021.101477]
Rosales Sanabria, Andrea; Zambonelli, Franco; Dobson, Simon; Ye, Juan
Files in this record:
File: 2021_contrasGAN_PMC.pdf
Access: Open access
Type: Author's version, revised and accepted for publication
Size: 5.19 MB
Format: Adobe PDF
Creative Commons licence
Metadata in IRIS UNIMORE are released under a Creative Commons CC0 1.0 Universal licence, while publication files are released under an Attribution 4.0 International licence (CC BY 4.0), unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11380/1298862
Citations
  • PubMed Central: not available
  • Scopus: 23
  • Web of Science: 16