ubicomplab / rPPG-Toolbox

rPPG-Toolbox: Deep Remote PPG Toolbox (NeurIPS 2023)
https://arxiv.org/abs/2210.00716

Question about label_type in preprocess #202

Closed: Dylan-H-Wang closed this issue 1 year ago

Dylan-H-Wang commented 1 year ago

Hi,

I am trying to understand the logic behind label_type preprocessing:

  1. For EfficientPhys (config file), the data type is set to Standardized but the label type is set to DiffNormalized. Is that possibly a bug, since the original paper used DiffNormalized inputs?
  2. I saw that for all datasets, the label type is set to DiffNormalized instead of Raw or Standardized. Does this mean that asking the model to predict the difference of the rPPG signal is more helpful than directly predicting the actual rPPG signal?
  3. If the model's output is the rPPG difference, how can we obtain the actual rPPG signal during inference?

Thank you!

yahskapar commented 1 year ago

Hi @Dylan-H-Wang,

  1. Not a bug. This has more to do with the EfficientPhys-C implementation we use having its own normalization module, which ultimately computes difference frames internally. Check out the two lines here which I believe are relevant.

  2. Not sure if there is some confusion of terminology here, but the reason we use DiffNormalized labels is that the data itself is effectively DiffNormalized, whether it is set to that explicitly during preprocessing or becomes that through a module such as the one mentioned in my first answer above. In other words, the model predicts the 1st derivative of the PPG signal (see the first sketch after this list).

  3. The outputs of the models are effectively the 1st derivative of the PPG signal. Check out this section in the README for visualizing the outputs; it should also give you an idea of how to use the .pkl files it mentions in your own code. You can also consider saving the predicted and ground truth signals just before they're used for HR calculations here. Since the prediction is a derivative, the waveform itself can be recovered by cumulative summation (see the second sketch below).
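
For items 1 and 2, here is a minimal sketch of what DiffNormalized preprocessing computes, assuming 1-D PPG labels and (T, H, W, C) video frames; the function names here are illustrative rather than the toolbox's exact API. The label becomes the frame-to-frame difference of the PPG wave scaled to unit variance, and the input (when not handled by an in-model module, as in EfficientPhys) becomes DeepPhys-style normalized difference frames:

```python
import numpy as np

def diff_normalize_label(label):
    """DiffNormalized label: frame-to-frame difference of the PPG wave,
    scaled to unit variance. `label` is a 1-D array of length T."""
    diff = np.diff(label, axis=0)        # p[t+1] - p[t]
    diff = diff / np.std(diff)           # scale by the std of the differences
    diff = np.append(diff, 0)            # pad so the length matches the input
    return np.nan_to_num(diff)           # guard against NaNs from flat signals

def diff_normalize_data(frames):
    """DeepPhys-style difference frames: (x[t+1] - x[t]) / (x[t+1] + x[t]),
    scaled to unit variance. `frames` has shape (T, H, W, C)."""
    frames = frames.astype(np.float32)
    num = frames[1:] - frames[:-1]
    den = frames[1:] + frames[:-1] + 1e-7      # avoid division by zero
    diff = num / den
    diff = diff / np.std(diff)
    diff = np.concatenate([diff, np.zeros_like(diff[:1])], axis=0)
    return np.nan_to_num(diff)
```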
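
For item 3, since the prediction approximates the 1st derivative, the waveform can be recovered by cumulative summation followed by detrending and band-pass filtering, which is essentially what the post-processing does before HR estimation. A rough sketch; the filter order and the 0.6-4.0 Hz cutoffs are illustrative choices for the heart-rate band, not the toolbox's exact settings:

```python
import numpy as np
from scipy import signal

def recover_ppg(pred_diff, fs=30.0):
    """Turn a predicted 1st-derivative signal back into a PPG waveform:
    integrate, remove drift, then band-pass to a plausible HR band."""
    ppg = np.cumsum(pred_diff)                 # integrate; offset is arbitrary
    ppg = signal.detrend(ppg)                  # remove integration drift
    b, a = signal.butter(2, [0.6, 4.0], btype="bandpass", fs=fs)
    return signal.filtfilt(b, a, ppg)          # zero-phase band-pass
```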

Hope this helps.

Dylan-H-Wang commented 1 year ago

Thank you for the answer. My problem is solved, so I'll close this issue.