maum-ai / faceshifter

Unofficial PyTorch Implementation for FaceShifter (https://arxiv.org/abs/1912.13457)
BSD 3-Clause "New" or "Revised" License

multi GPU training #16

Closed: hanikh closed this issue 3 years ago

hanikh commented 3 years ago

Hi, thanks for your great code. I have a problem when using 2 GPUs: with 1 GPU the training speed is about 0.75 s/it (according to the progress bar), but with 2 GPUs it is about 1.33 s/it. Since the total number of iterations per epoch is halved with 2 GPUs, one epoch ends up taking almost the same time in both cases (1 or 2 GPUs). Would you please help me figure out what the problem is? Thanks a lot.

usingcolor commented 3 years ago

This is not an error or an issue, I think. For multi-GPU training, pytorch-lightning uses DDP (DistributedDataParallel). I recommend you read this
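As a rough sketch of what that means in practice (this is not the repo's actual training script, and the exact `Trainer` arguments depend on the pytorch-lightning version):

```python
# Minimal sketch of enabling DDP in pytorch-lightning; argument names
# are version-dependent (newer releases use `strategy="ddp"` instead
# of `accelerator="ddp"`).
from pytorch_lightning import Trainer

# With DDP, each GPU runs its own process on its own shard of the data,
# so the progress bar counts iterations per process: 2 GPUs halve the
# iterations per epoch, while each iteration gets somewhat slower
# because gradients are synchronized across processes after every step.
trainer = Trainer(gpus=2, accelerator="ddp")
```

So seeing 1.33 s/it on 2 GPUs versus 0.75 s/it on 1 GPU, with half as many iterations per epoch, is expected DDP behavior rather than a bug: the per-step synchronization overhead eats into the speedup.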