AaronAnima opened this issue 5 years ago
Hi, I implemented three TPU-enabled PyTorch training repos for BigGAN-PyTorch, all based on this repo:
- **BigGAN-PyTorch-TPU-Single**: training BigGAN with a single TPU.
- **BigGAN-PyTorch-TPU-Parallel**: parallel (multi-thread) version for training BigGAN with TPU.
- **BigGAN-PyTorch-TPU-Distribute**: distributed (multi-process) version for training BigGAN with TPU.
You might want to give them a try. Pull requests fixing the remaining issues would be appreciated.
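For anyone unfamiliar with the torch_xla API these three variants build on, here is a minimal, self-contained sketch of the underlying training pattern. The model, batch shapes, and loop are toy stand-ins made up for illustration, not code from the repos above; it assumes a TPU runtime with torch_xla installed.

```python
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm
import torch_xla.distributed.xla_multiprocessing as xmp  # used by the distributed variant


def train_loop(index=0):
    device = xm.xla_device()                # one TPU core as an XLA device
    model = nn.Linear(128, 10).to(device)   # toy stand-in for BigGAN's G/D
    opt = torch.optim.Adam(model.parameters(), lr=2e-4)
    loss_fn = nn.CrossEntropyLoss()
    for step in range(100):
        x = torch.randn(64, 128, device=device)         # synthetic batch
        y = torch.randint(0, 10, (64,), device=device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        # barrier=True forces the lazily built XLA graph to execute each step
        # (the usual single-device pattern; multi-process runs typically rely
        # on ParallelLoader to insert the step markers instead).
        xm.optimizer_step(opt, barrier=True)


if __name__ == "__main__":
    # Single-TPU variant: run the loop in the main process.
    train_loop()
    # Distributed (multi-process) variant: one process per core, e.g.
    # xmp.spawn(train_loop, nprocs=8)   # nprocs=8 for a v3-8
```

The Parallel and Distribute variants differ mainly in how they replicate this loop across the eight cores: one Python process with a thread per core versus one process per core via `xmp.spawn`.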
Hi ajbrock, thanks for open-sourcing the BigGAN code; it has been a great help for further exploration of BigGAN. Given the time cost of training, though, we hope to train BigGAN on TPU, so we used a TensorFlow implementation that looks close to yours (https://github.com/Octavian-ai/BigGAN-TPU-TensorFlow). However, many of our attempts turned out badly. Could you also release an official TPU version of the BigGAN code?
Sincerely
Config: ImageNet 2012, tf-1.12.2, GCP v3-8 pod, training for up to 300k steps.
Some failed results (sample images at 100k, 180k, and 300k steps).