mingyuliutw / UNIT

Unsupervised Image-to-Image Translation

Multiple GPU training? #93

Closed harsmac closed 5 years ago

harsmac commented 5 years ago

I have high-resolution images and am running the training on a GTX 1080 with 12 GB of RAM. I reduced the random crops to 64x64 to let the training proceed; the GPU runs out of memory whenever the crop size is any greater than 64x64. Hence I was wondering whether it could be run on multiple GPUs to speed up the training?

mingyuliutw commented 5 years ago

Multiple-GPU training is a bit tricky with the current implementation. Stay tuned. We plan to release a new iteration of the UNIT method, which should support multi-GPU training and hopefully better quality.
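In the meantime, the standard PyTorch idiom for splitting a batch across GPUs is to wrap the network in `nn.DataParallel`. This is only a minimal sketch, not the repo's own code: UNIT's actual trainer class is structured differently (which is likely why the author calls multi-GPU support tricky), and the `model` below is a hypothetical stand-in for the generator.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for UNIT's generator; the real trainer differs.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)

# DataParallel splits each input batch across all visible GPUs and
# gathers the outputs; with 0 or 1 GPUs it just runs the module as-is.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

x = torch.randn(4, 3, 64, 64, device=device)  # a batch of 64x64 crops
out = model(x)
print(tuple(out.shape))  # batch dimension is preserved: (4, 3, 64, 64)
```

Note that `DataParallel` only parallelizes the forward/backward of the wrapped module; any loss terms computed outside it (as in UNIT's trainer, which calls its generators and discriminators separately) would each need to be wrapped or restructured, which matches the maintainer's caveat above.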

flagman commented 5 years ago

Any update on this?

jakubLangr commented 4 years ago

Hi @mingyuliutw I am also wondering about the multi-GPU implementation?