Pervasive Self-Learning with multi-modal distributed sensors / Bicocchi, Nicola; Mamei, Marco; Prati, Andrea; Cucchiara, Rita; Zambonelli, Franco. - PRINT. - (2008), pp. 61-66. (Paper presented at the 2nd IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops, SASOW 2008, held in Venice, Italy, October 20-24, 2008) [10.1109/SASOW.2008.51].

Pervasive Self-Learning with multi-modal distributed sensors

BICOCCHI, Nicola; MAMEI, Marco; PRATI, Andrea; CUCCHIARA, Rita; ZAMBONELLI, Franco
2008

Abstract

Truly ubiquitous computing poses new and significant challenges. One of the key aspects that will condition the impact of these new technologies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This will make devices able to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. This framework allows different sensors to exchange the knowledge needed to create a model to classify events. In particular, we developed, as a case study, a multi-modal multi-sensor classification system combining data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer learns a model of the user's behavior by exploiting the information coming from the camera and later uses it to classify the user's motion autonomously. Experiments demonstrate the accuracy of the proposed approach in different situations.
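The cross-modal training process described in the abstract — a camera supplying motion-state labels while in view, so that the accelerometer can later classify on its own — can be illustrated with a minimal sketch. All names, the nearest-centroid classifier, and the feature choice (per-window mean and variance of acceleration magnitude) are illustrative assumptions, not the paper's actual method.

```python
import math

def features(window):
    """Mean and variance of acceleration magnitude over one window of (x, y, z) samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, var)

class NearestCentroid:
    """Minimal classifier trained from camera-provided labels."""
    def fit(self, windows, labels):
        sums, counts = {}, {}
        for w, lab in zip(windows, labels):
            f = features(w)
            s = sums.setdefault(lab, [0.0, 0.0])
            s[0] += f[0]
            s[1] += f[1]
            counts[lab] = counts.get(lab, 0) + 1
        self.centroids = {lab: (s[0] / counts[lab], s[1] / counts[lab])
                          for lab, s in sums.items()}
        return self

    def predict(self, window):
        f = features(window)
        return min(self.centroids,
                   key=lambda lab: sum((a - b) ** 2
                                       for a, b in zip(f, self.centroids[lab])))

# Training phase: labels come from the camera while it can see the user.
still = [(0.0, 0.0, 9.8)] * 10                                    # near-constant gravity
moving = [(0.5 * (-1) ** i, 0.3, 9.8 + (-1) ** i) for i in range(10)]  # oscillating signal
model = NearestCentroid().fit([still, moving], ["standing", "walking"])

# Autonomous phase: the accelerometer node classifies without the camera.
print(model.predict([(0.0, 0.1, 9.8)] * 10))  # → standing
```

The design point is that the camera is only needed during the labeling phase; once the centroids are stored on the body-worn node, classification requires nothing beyond the local accelerometer signal.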
2008
2nd IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops, SASOW 2008
Venice, Italy
October 20-24, 2008
61
66
Bicocchi, Nicola; Mamei, Marco; Prati, Andrea; Cucchiara, Rita; Zambonelli, Franco


Use this identifier to cite or link to this document: https://hdl.handle.net/11380/618704
Citations
  • Scopus 9
  • Web of Science 5