ajbrock / BigGAN-PyTorch

The author's officially unofficial PyTorch BigGAN implementation.
MIT License

Looking forward to official TPU-version BigGAN code #48

Open AaronAnima opened 5 years ago

AaronAnima commented 5 years ago

Hi ajbrock, Thanks for open-sourcing the BigGAN code, which benefits further exploration of BigGAN a lot. But considering the time cost, we hoped to train BigGAN on TPU, so we used a TensorFlow implementation that looks close to yours (https://github.com/Octavian-ai/BigGAN-TPU-TensorFlow). But many of our attempts turned out badly. Could you release an official TPU version of the BigGAN code?

Sincerely

Config: ImageNet-2012, tf-1.12.2, GCP v3-8 pod, training for up to 300k steps

Some failed results (sample images at 100k, 180k, and 300k steps)

shizhediao commented 4 years ago

Hi, I implemented three TPU-enabled PyTorch training repos for BigGAN-PyTorch, all of which are based on this repo.

BigGAN-PyTorch-TPU-Single: training BigGAN with a single TPU.
BigGAN-PyTorch-TPU-Parallel: parallel version (multi-thread) for training BigGAN with TPU.
BigGAN-PyTorch-TPU-Distribute: distributed version (multi-process) for training BigGAN with TPU.

You might want to give them a try. Pull requests to fix some of the open issues would be appreciated.
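For context, the three variants above differ mainly in how the training step is launched on the TPU cores. Below is a minimal sketch of the multi-process ("Distribute") pattern using the PyTorch/XLA API (`xm.xla_device`, `xm.optimizer_step`, and `xmp.spawn` are real torch_xla calls); the model and batch here are placeholders, not BigGAN itself, and this only runs on a TPU host with `torch_xla` installed:

```python
# Hedged sketch of the multi-process TPU training pattern
# (the "Distribute" variant); requires a TPU VM with torch_xla.
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm
import torch_xla.distributed.xla_multiprocessing as xmp


def train_fn(index):
    device = xm.xla_device()                # one XLA device per process
    model = nn.Linear(128, 10).to(device)   # placeholder for G/D networks
    opt = torch.optim.Adam(model.parameters(), lr=2e-4)
    for step in range(100):
        x = torch.randn(64, 128, device=device)  # placeholder batch
        loss = model(x).pow(2).mean()            # placeholder loss
        opt.zero_grad()
        loss.backward()
        xm.optimizer_step(opt)  # all-reduces gradients across TPU cores


if __name__ == "__main__":
    xmp.spawn(train_fn, nprocs=8)  # a v3-8 exposes 8 TPU cores
```

The single-TPU variant drops `xmp.spawn` and calls `train_fn` directly, while the multi-thread ("Parallel") variant replaces process spawning with torch_xla's threaded data-parallel loader; the training-step body stays essentially the same.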