Boese0601 / MagicDance

[ICML 2024] MagicPose(also known as MagicDance): Realistic Human Poses and Facial Expressions Retargeting with Identity-aware Diffusion
https://boese0601.github.io/magicdance/

It seems that the Dataloader is performing slower than expected. When I train using Moore's open-source AnimateAnyone code, I have noticed that the GPU utilization reaches around 90%. However, when using MagicPose, the GPU utilization is only around 20%. I have a dataset of approximately 3K videos. #14

Closed Jeff-Fudan closed 7 months ago

Jeff-Fudan commented 7 months ago

During training with MagicPose, the GPU frequently sits idle waiting for the next batch to load.

Boese0601 commented 7 months ago

You should probably consider increasing your num_workers to 8 or more; the default is 1. I don't know much about accelerating the training beyond that, since I implemented this myself with the PyTorch DataLoader instead of ByteDance's original internal package.
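For reference, raising `num_workers` is usually combined with a few other standard PyTorch `DataLoader` knobs. This is a generic sketch, not the repo's actual training code: `PlaceholderVideoDataset` and all values shown are illustrative.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class PlaceholderVideoDataset(Dataset):
    """Placeholder for the MagicPose video dataset (illustrative only)."""
    def __len__(self):
        return 128

    def __getitem__(self, idx):
        return torch.randn(3, 64, 64)

loader = DataLoader(
    PlaceholderVideoDataset(),
    batch_size=4,
    shuffle=True,
    num_workers=8,            # default here is reportedly 1; raise to 8 or more
    pin_memory=True,          # faster host-to-GPU transfers
    persistent_workers=True,  # keep workers alive across epochs
    prefetch_factor=2,        # batches each worker pre-loads (needs num_workers > 0)
)

batch = next(iter(loader))
print(batch.shape)
```

If raising `num_workers` alone does not help, the per-sample cost inside `__getitem__` (e.g. video decoding) is the next thing to profile.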

Jeff-Fudan commented 7 months ago

> You should probably consider increasing your num_workers to 8 or more; the default is 1. I don't know much about accelerating the training beyond that, since I implemented this myself with the PyTorch DataLoader instead of ByteDance's original internal package.

Well, I have already set num_workers to 8 for both MagicPose and Moore's open-source code, yet MagicPose is still noticeably slower.