Closed xiezhy6 closed 2 years ago
@xiezhy6 We don't plan to release the multi-GPU training code yet. If you use DataParallel from the PyTorch library, you can do multi-GPU training easily :)
@xiezhy6 @koo616 I also want to modify train_condition.py into a DataParallel version.
Although directly wrapping the ConditionGenerator and Discriminator with DataParallelWithCallback
is easy, it is unknown whether this could make training unstable and harm the final performance.
Can you give me some advice? I will train the condition generator with DataParallel and share my results here. If you already have results from a DataParallel experiment, you are welcome to share them as well. Thanks!!
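For reference, a minimal sketch of the wrapping being discussed, using plain `torch.nn.DataParallel` (the `DataParallelWithCallback` mentioned above is the synchronized-BatchNorm variant from the Synchronized-BatchNorm-PyTorch project, which follows the same wrapping pattern). The `ConditionGenerator` here is a stand-in module, not the repo's actual class:

```python
import torch
import torch.nn as nn

class ConditionGenerator(nn.Module):
    """Stand-in for the repo's ConditionGenerator; the real model
    takes clothing/pose inputs, but the wrapping pattern is the same."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):
        return self.net(x)

tocg = ConditionGenerator()
if torch.cuda.device_count() > 1:
    # Replicates the module across all visible GPUs and splits the
    # batch along dim 0; gradients are reduced onto the default device.
    tocg = nn.DataParallel(tocg)

device = "cuda" if torch.cuda.is_available() else "cpu"
tocg = tocg.to(device)

x = torch.randn(2, 3, 64, 64, device=device)
out = tocg(x)
print(out.shape)  # torch.Size([2, 3, 64, 64])
```

Note that `DataParallel` splits the batch across GPUs, so each replica sees a smaller per-GPU batch; with plain (unsynchronized) BatchNorm the running statistics are computed per replica, which is one plausible reason training could become unstable, and why the sync-BN wrapper is often used instead.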
Hi,
Thanks for releasing the training code.
I would like to train the condition generator on another dataset (with a resolution of 512 x 384). However, running the full 300,000 steps under the default settings takes a long time (> 130 h). So I would like to ask whether the author plans to release a multi-GPU version of the training code, or whether there is any suggestion for how to train the condition generator within 1~2 days?