ubicomplab / rPPG-Toolbox

rPPG-Toolbox: Deep Remote PPG Toolbox (NeurIPS 2023)
https://arxiv.org/abs/2210.00716

Preprocessing is using the CPU rather than the GPU #215

Closed ajay-vishnu closed 10 months ago

ajay-vishnu commented 11 months ago

The train_loader() called in main.py is using the CPU instead of the GPU during preprocessing, even though the device name is given as cuda:0. Is there any way to fix this?

yahskapar commented 11 months ago

Hi @ajay-vishnu,

This toolbox's preprocessing code, like most preprocessing code you'll see in other projects (with some exceptions), does not explicitly leverage the GPU to speed up or parallelize preprocessing. Instead, multiple sub-processes are spawned and utilized, as shown here in the multi_process_manager() function. You can read more about how this kind of multiprocessing works here. Generally speaking, preprocessing on the CPU rather than the GPU is recommended for a variety of reasons, some of which are covered in this article. There are certainly scenarios where preprocessing on the GPU can be helpful, especially with more complicated preprocessing pipelines, but that isn't the case with this toolbox or with any rPPG work I've come across.
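To illustrate the idea, here is a minimal sketch of quota-limited multiprocessing, loosely modeled on what a manager like multi_process_manager() does. All names here (preprocess_one, the quota parameter, the dummy paths) are illustrative, not the toolbox's actual API.

```python
import multiprocessing as mp
import time

def preprocess_one(video_path, results):
    # Stand-in for CPU-bound per-video work (face cropping, resizing, etc.).
    results[video_path] = f"processed:{video_path}"

def multi_process_manager(video_paths, quota=8):
    """Spawn one worker per video, never exceeding `quota` live processes."""
    manager = mp.Manager()
    results = manager.dict()
    running = []
    for path in video_paths:
        # Wait for a worker slot to free up once the quota is reached.
        while len(running) >= quota:
            running = [p for p in running if p.is_alive()]
            time.sleep(0.05)
        p = mp.Process(target=preprocess_one, args=(path, results))
        p.start()
        running.append(p)
    for p in running:
        p.join()
    return dict(results)

if __name__ == "__main__":
    out = multi_process_manager([f"video_{i}.avi" for i in range(4)], quota=2)
    print(len(out))  # 4
```

The quota keeps peak memory and file-handle usage bounded no matter how many videos are queued, which is why lowering it is the usual fix when preprocessing crashes on resource-constrained machines.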

Are you having trouble with the default CPU usage when preprocessing a dataset with this toolbox? Can you share more details on what exactly is happening? If your program crashes during multiprocessing, or you're getting cryptic OpenCV errors that appear related to a process being denied a resource, I suggest lowering the multi_process_quota variable on this line in BaseLoader.py from its default of 8. You might start with a value of 1 just to make sure preprocessing works before increasing the quota. If you're having some other trouble, please share more details so that we can help you.
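The "start with one worker, then scale up" debugging approach can be sketched with a plain multiprocessing.Pool; the function and file names below are illustrative and not the toolbox's actual code.

```python
import multiprocessing as mp

def preprocess(path):
    # Stand-in for CPU-bound per-file work.
    return path.upper()

def run_preprocessing(paths, workers):
    """Preprocess `paths` with a fixed number of worker processes."""
    with mp.Pool(processes=workers) as pool:
        return pool.map(preprocess, paths)

if __name__ == "__main__":
    paths = ["a.avi", "b.avi", "c.avi"]
    # Debug run with a single worker, analogous to multi_process_quota=1.
    single = run_preprocessing(paths, workers=1)
    # Once that succeeds, scale back up (analogous to the default quota of 8).
    scaled = run_preprocessing(paths, workers=4)
    assert single == scaled
```

A single worker removes contention entirely, so if preprocessing still fails at workers=1 the problem is in the per-file work itself rather than in resource exhaustion.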

yahskapar commented 10 months ago

I'm going to go ahead and close this since there's been no follow-up, but please let us know if you still have any confusion or are running into an issue I may have misunderstood, @ajay-vishnu.