RahulBhalley / progressive-growing-of-gans.pytorch

ProGAN with Standard, WGAN, WGAN-GP, LSGAN, BEGAN, DRAGAN, Conditional GAN, InfoGAN, and Auxiliary Classifier GAN training methods
https://arxiv.org/abs/1710.10196
MIT License

PyTorch on Windows does not support multiprocessing #6

Open · powerspowers opened this issue 5 years ago

powerspowers commented 5 years ago

I'm attempting to figure out which parts of the code spawn new processes, since this fails on PyTorch for Windows 10. I recently had to modify pcgan in a similar way to get it working.

powerspowers commented 5 years ago

Wrapping the main calls at the bottom of pggan.py in a `__main__` guard seems to have solved the multiprocessing problem:

```python
if __name__ == '__main__':
```
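For anyone wondering why the guard matters: on Windows, multiprocessing spawns fresh interpreter processes instead of forking, so every DataLoader worker re-imports the main module, and any unguarded top-level training code would re-execute in each worker. A minimal runnable sketch (the `train` function and dataset here are stand-ins, not the repo's actual code):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def train():
    # num_workers > 0 spawns worker processes; on Windows each one
    # re-imports this module rather than inheriting it via fork.
    dataset = TensorDataset(torch.randn(64, 3))
    loader = DataLoader(dataset, batch_size=8, num_workers=2)
    for (batch,) in loader:
        pass  # training step would go here

if __name__ == '__main__':
    # Only the parent process enters here; spawned workers skip it,
    # so training isn't launched recursively.
    train()
```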

powerspowers commented 5 years ago

Also, it appears you have to remove `volatile=True`, since that's been deprecated.
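Since PyTorch 0.4 the `volatile` flag on `Variable` is ignored, and the usual replacement is a `torch.no_grad()` context for inference-time code. A sketch of the migration (`netG` here is just a stand-in module):

```python
import torch
import torch.nn as nn

netG = nn.Linear(512, 512)  # stand-in for the generator
x = torch.randn(1, 512)

# Old (pre-0.4) style, now deprecated:
#   from torch.autograd import Variable
#   out = netG(Variable(x, volatile=True))

# Replacement: disable autograd with a no_grad context instead.
with torch.no_grad():
    out = netG(x)
```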

powerspowers commented 5 years ago

In addition, all `data[0]` references need to be switched to `item()`, which follows from the same PyTorch 0.4 changes that deprecated `volatile`.

https://github.com/pytorch/pytorch/issues/6061
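A sketch of that change, where `loss` stands in for any zero-dimensional tensor such as a GAN loss:

```python
import torch

loss = torch.tensor(0.5)  # stand-in for a scalar loss tensor

# Old: indexing .data pulled out the Python number.
#   val = loss.data[0]   # fails on 0-dim tensors since 0.4

# New: .item() extracts the Python scalar.
val = loss.item()
```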

powerspowers commented 5 years ago

The last thing I had to do to get this working on Windows 10 PyTorch was change all the directory and file creation paths to use backslashes instead of forward slashes. I also had to remove the '-p' flag from the mkdir calls. I commented out the TensorBoard code section until I can spend some time installing it. So far it seems to be running well on my Dell XPS with an NVIDIA 1080 (6GB).
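Rather than hard-coding backslashes, a portable alternative (a sketch, assuming the repo builds paths as strings and shells out to `mkdir -p`) is to let Python pick the separator and create the parent directories itself:

```python
import os

# os.path.join uses the correct separator for the host OS, and
# exist_ok=True replicates mkdir -p's "create parents, don't fail
# if the directory already exists" behavior.
out_dir = os.path.join('repo', 'save', 'grid')  # hypothetical path components
os.makedirs(out_dir, exist_ok=True)
```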

powerspowers commented 5 years ago

Also had to add the following modification, because `DataParallel` wraps the original model and exposes it through its `module` attribute:

```python
self.G.module.grow_network(floor(self.resl))
self.D.module.grow_network(floor(self.resl))
```
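A minimal sketch of why the extra `.module` hop is needed (`Net` and `grow_network` here stand in for the repo's generator/discriminator classes):

```python
import torch.nn as nn

class Net(nn.Module):
    # Stand-in for a model with a custom method like grow_network.
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(4, 4))

    def grow_network(self, resl):
        print(f'growing to resolution level {resl}')

    def forward(self, x):
        return self.layers(x)

net = nn.DataParallel(Net())
# net.grow_network(3)        # AttributeError: custom methods aren't forwarded
net.module.grow_network(3)   # the wrapped Net is reachable via .module
```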

powerspowers commented 5 years ago

In `intermediate_block` you have to cast `ndim` to an int, because in Python 3 dividing two integers with `/` produces a float. Grrr, I only ran into this one once `resl` got to 6 and the code that halves `ndim` kicked in:

```python
ndim = int(ndim / 2)
```
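Equivalently, floor division avoids the cast; a quick sketch of the Python 3 behavior:

```python
ndim = 512

print(ndim / 2)    # 256.0 -- true division always returns a float
print(ndim // 2)   # 256   -- floor division keeps ints int

ndim = ndim // 2   # drop-in alternative to int(ndim / 2) for non-negative ints
```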