
pyVHR: a Python framework for remote photoplethysmography

V. Cuculo
2022

Abstract

Remote photoplethysmography (rPPG) aims to automatically estimate heart rate (HR) variability from videos recorded in realistic environments. Over the past two decades a number of effective data-driven, model-based and statistical approaches have emerged, showing an increasing ability to recover the blood volume pulse (BVP) signal from which beats per minute (BPM) can be computed; more recently, learning-based rPPG methods have also been proposed. The pyVHR framework implements a multi-stage pipeline covering the whole process of extracting and analyzing HR fluctuations. It is designed for both theoretical studies and practical applications in contexts where wearable sensors are inconvenient to use. Specifically, pyVHR supports the development, assessment and statistical analysis of novel rPPG methods, whether traditional or learning-based, as well as the sound comparison of well-established methods across multiple datasets. It is built on accelerated Python libraries for video and signal processing and is equipped with parallel/accelerated ad-hoc procedures that pave the way to online processing on a GPU. The whole accelerated pipeline runs in real time on 30 fps HD videos, with an average speedup of around 5x. The paper is shaped as a gentle tutorial presentation of the framework.
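The abstract describes pyVHR as an end-to-end, GPU-accelerated pipeline going from raw video to per-window BPM estimates. As a rough illustration, a minimal usage sketch of that pipeline is given below; the Pipeline class, the run_on_video method and its keyword arguments are recalled from the project's documentation and may differ between pyVHR releases, and 'video.mp4' is a placeholder path.

# Minimal usage sketch of the pyVHR pipeline (assumption: names and
# keyword arguments follow the pyVHR documentation and may differ
# across releases; 'video.mp4' is a placeholder path).
from pyVHR.analysis.pipeline import Pipeline

pipe = Pipeline()

# Run the full multi-stage pipeline on one video: skin/ROI extraction,
# RGB trace computation, BVP estimation with a chosen rPPG method,
# and windowed BPM estimation.
times, BPM, uncertainty = pipe.run_on_video(
    'video.mp4',
    roi_approach='patches',   # assumed keyword: patch-based (vs. holistic) ROI analysis
    roi_method='convexhull',  # assumed keyword: skin/face region extraction strategy
    method='cupy_POS',        # assumed keyword: GPU-accelerated POS rPPG method
    cuda=True)                # assumed keyword: run accelerated kernels on the GPU

# Inspect the per-window timestamps and heart-rate estimates;
# 'uncertainty' reflects the dispersion of the estimates.
print(times)
print(BPM)

On a CUDA-capable GPU this single call covers the whole chain the abstract refers to; on CPU-only machines the cuda flag and the cupy-based method would need to be swapped for their CPU counterparts, if available in the installed version.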
Year: 2022
Volume: 8
Pages: 1-37
pyVHR: a Python framework for remote photoplethysmography / Boccignone, G.; Conte, Donatello; Cuculo, V.; D’Amelio, Alessandro; Grossi, Giuliano; Lanzarotti, R.; Mortara, Edoardo. - In: PEERJ COMPUTER SCIENCE. - ISSN 2376-5992. - 8 (2022), pp. 1-37. [10.7717/peerj-cs.929]
Boccignone, G.; Conte, Donatello; Cuculo, V.; D’Amelio, Alessandro; Grossi, Giuliano; Lanzarotti, R.; Mortara, Edoardo
Files in this record:
peerj-cs-929.pdf - Open access - Publisher's published version - Adobe PDF, 6.76 MB

Metadata in IRIS UNIMORE are released under a Creative Commons CC0 1.0 Universal license, while publication files are released under the Attribution 4.0 International (CC BY 4.0) license unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1300664
Citations
  • PubMed Central: 4
  • Scopus: 21
  • Web of Science: 12