
RobustPPG: camera-based robust heart rate estimation using motion cancellation


Overview

This is an implementation of the paper "RobustPPG: camera-based robust heart rate estimation using motion cancellation" by Akash Kumar Maity, Jian Wang, Ashutosh Sabharwal, and Shree K. Nayar, published in Biomedical Optics Express. (* indicates equal contribution.)

In this work, we develop a motion-robust algorithm, RobustPPG, for extracting the photoplethysmography (PPG) signal from face video and estimating heart rate. Our key innovation is to explicitly model and generate the motion distortions caused by movements of the person's face, and then use the generated distortion to filter the motion-corrupted measurements. The resulting heart-rate estimates improve over state-of-the-art methods.
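The cancellation idea above can be sketched as removing the generated motion trace from the measured intensity trace. The snippet below uses a simple least-squares projection as an illustrative stand-in; it is not the paper's exact filter (which models distortions from the tracked face geometry), and all names and signal parameters are assumptions for the demo.

```python
import numpy as np

def cancel_motion(intensity, motion):
    """Remove a generated motion-distortion trace from a measured
    intensity trace via least-squares projection (illustrative sketch;
    the paper's actual motion filter is more sophisticated)."""
    intensity = np.asarray(intensity, dtype=float)
    motion = np.asarray(motion, dtype=float)
    intensity = intensity - intensity.mean()
    motion = motion - motion.mean()
    # Least-squares gain of the motion trace present in the measurement.
    g = np.dot(motion, intensity) / np.dot(motion, motion)
    return intensity - g * motion

# Toy demo: a weak 1.2 Hz pulse buried under a strong 0.4 Hz motion
# distortion, sampled at a hypothetical 30 fps camera rate.
fs = 30.0
t = np.arange(0, 10, 1 / fs)
pulse = 0.1 * np.sin(2 * np.pi * 1.2 * t)       # true PPG component
motion = np.sin(2 * np.pi * 0.4 * t)            # generated distortion
measured = pulse + 3.0 * motion                 # motion-corrupted signal
cleaned = cancel_motion(measured, motion)       # recovers ~pulse
```

In this synthetic case the motion trace is known exactly, so the projection recovers the pulse almost perfectly; with real generated distortions the removal is approximate.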

Publication: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9664884/.

Presentation video: https://www.youtube.com/watch?v=uxMm4vJhvFA.

Dataset

Download our RICE-Motion dataset here.

Instructions for running the code

  1. Download the pre-trained model and some examples of preprocessed data containing landmarks and face mesh information, and place these two folders in the code directory. Note that this work uses FaceMesh from Snap Inc. for face tracking and fitting; other methods such as ARKit, MediaPipe, or more recent face trackers can also generate a face mesh from a face video.
  2. Run main_start.m to generate the surface normal estimates and the pixel intensity fluctuations for each triangle in the face mesh. The result is saved in *_processed.mat.
  3. Run main_process.m to extract the PPG signal and heart rate from the distorted pixel intensity fluctuations.
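Once a clean PPG trace has been extracted, the heart rate is typically read off as the dominant spectral peak of the signal. The sketch below illustrates that final step; it is not the repository's exact main_process.m logic, and the function name and the 0.7–4 Hz pulse band are assumptions.

```python
import numpy as np

def estimate_heart_rate(ppg, fs, lo=0.7, hi=4.0):
    """Estimate heart rate in bpm as the dominant spectral peak of a
    PPG trace within a plausible pulse band (0.7-4 Hz is an assumed
    range, roughly 42-240 bpm)."""
    ppg = np.asarray(ppg, dtype=float) - np.mean(ppg)
    spectrum = np.abs(np.fft.rfft(ppg))
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * f_peak

# Toy demo: a noisy 1.2 Hz pulse (~72 bpm) sampled at 30 fps for 20 s.
fs = 30.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
bpm = estimate_heart_rate(ppg, fs)
```

The 20 s window gives a 0.05 Hz (3 bpm) frequency resolution; longer windows sharpen the estimate at the cost of responsiveness to heart-rate changes.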