Online Learning and Classification of EMG-Based Gestures on a Parallel Ultra-Low Power Platform Using Hyperdimensional Computing / Benatti, S.; Montagna, F.; Kartsch, V.; Rahimi, A.; Rossi, D.; Benini, L. - In: IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS. - ISSN 1932-4545. - 13:3(2019), pp. 516-528. [10.1109/TBCAS.2019.2914476]
Online Learning and Classification of EMG-Based Gestures on a Parallel Ultra-Low Power Platform Using Hyperdimensional Computing
Benatti S.;
2019
Abstract
This paper presents a wearable electromyographic gesture recognition system based on the hyperdimensional computing paradigm, running on a programmable parallel ultra-low-power (PULP) platform. The processing chain includes efficient on-chip training, which leads to a fully embedded implementation with no need to perform any offline training on a personal computer. The proposed solution has been tested on 10 subjects in a typical gesture recognition scenario, achieving 85% average accuracy on the recognition of 11 gestures, which is aligned with the state of the art, with the unique capability of performing online learning. Furthermore, by virtue of the hardware-friendly algorithm and of the efficient PULP system-on-chip (Mr. Wolf) used for prototyping and evaluation, the energy budget required to run the learning part with 11 gestures is 10.04 mJ, and 83.2 μJ per classification. The system works with an average power consumption of 10.4 mW in classification, ensuring around 29 h of autonomy with a 100 mAh battery. Finally, the scalability of the system is explored by increasing the number of channels (up to 256 electrodes), demonstrating the suitability of our approach as a universal, energy-efficient wearable biopotential recognition framework.
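To illustrate the hyperdimensional computing paradigm the abstract refers to, the following is a minimal C sketch of binary hypervector bundling and Hamming-distance classification, the two core operations behind on-chip training and inference in HD computing. The dimensionality, class count, names, and the omission of the EMG-specific encoder (channel/value item memories, temporal n-grams) are assumptions for illustration, not the paper's actual implementation on the Mr. Wolf SoC.

/*
 * Minimal sketch of binary hyperdimensional (HD) classification.
 * Prototype "training" = bit-wise majority bundling of encoded samples;
 * classification = nearest prototype by Hamming distance.
 * Dimensions and names are illustrative assumptions.
 */
#include <stdint.h>

#define HD_DIM    10000                      /* hypervector dimensionality   */
#define HD_WORDS  ((HD_DIM + 31) / 32)       /* 32-bit words per hypervector */
#define N_CLASSES 11                         /* gestures, as in the abstract */

typedef uint32_t hv_t[HD_WORDS];

/* Class prototypes ("associative memory") built during on-chip training. */
static hv_t prototypes[N_CLASSES];

/* Bundle n encoded hypervectors into one prototype by bit-wise majority
 * voting (ties for even n fall to 0 in this sketch). */
void hd_bundle(const hv_t *samples, int n, hv_t out)
{
    for (int w = 0; w < HD_WORDS; w++) {
        uint32_t word = 0;
        for (int b = 0; b < 32; b++) {
            int ones = 0;
            for (int s = 0; s < n; s++)
                ones += (samples[s][w] >> b) & 1u;
            if (2 * ones > n)                /* majority of ones sets the bit */
                word |= (1u << b);
        }
        out[w] = word;
    }
}

/* Hamming distance between two packed binary hypervectors. */
static int hd_hamming(const hv_t a, const hv_t b)
{
    int dist = 0;
    for (int w = 0; w < HD_WORDS; w++)
        dist += __builtin_popcount(a[w] ^ b[w]);
    return dist;
}

/* Classify an encoded query hypervector: nearest prototype in Hamming space. */
int hd_classify(const hv_t query)
{
    int best_class = 0, best_dist = HD_DIM + 1;
    for (int c = 0; c < N_CLASSES; c++) {
        int d = hd_hamming(query, prototypes[c]);
        if (d < best_dist) {
            best_dist = d;
            best_class = c;
        }
    }
    return best_class;
}

Because both bundling and lookup reduce to XOR/popcount loops over independent words, this style of workload maps naturally onto the parallel cluster of a PULP platform, which is consistent with the energy figures reported in the abstract.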