taesungp / contrastive-unpaired-translation

Contrastive unpaired image-to-image translation, faster and lighter training than cyclegan (ECCV 2020, in PyTorch)
https://taesung.me/ContrastiveUnpairedTranslation/

I met a batch-size problem. #117

Open edwardcho opened 2 years ago

edwardcho commented 2 years ago

Hello Sir,

When I try to train on my dataset (512x512, 3-channel PNG images), I run into a batch-size problem. My GPU is a GTX 1080 Ti (12 GB). I can only train with batch size 1; if I set it to 2 or more, training fails. Is this normal?
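For reference, a minimal PyTorch snippet to confirm how much memory the card reports and how much is already in use (a rough sketch; it assumes a CUDA build of PyTorch and that training runs on device `cuda:0`):

```python
import torch

# Report the total memory of the first CUDA device and how much PyTorch has
# currently allocated/reserved on it. Useful to confirm an OOM is really a
# memory-capacity issue rather than a driver or setup problem.
device = torch.device("cuda:0")  # assumed single-GPU setup
props = torch.cuda.get_device_properties(device)
print(f"GPU: {props.name}")
print(f"Total memory:     {props.total_memory / 1024**3:.1f} GiB")
print(f"Allocated so far: {torch.cuda.memory_allocated(device) / 1024**3:.2f} GiB")
print(f"Reserved so far:  {torch.cuda.memory_reserved(device) / 1024**3:.2f} GiB")
```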

Thanks, Edward Cho.

taesungp commented 2 years ago

Do you get an out-of-memory error? You are training at a higher resolution (4 times more pixels than a 256x256 image), so you can't fit 2 or more samples in a batch.
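Roughly, assuming activation memory in the networks scales with the number of pixels per image:

```python
# Back-of-the-envelope comparison: a 512x512 image has 4x the pixels of a
# 256x256 image, so the activations for one 512x512 sample cost roughly as
# much memory as four 256x256 samples (model weights are resolution-independent).
pixels_256 = 256 * 256                 # 65,536
pixels_512 = 512 * 512                 # 262,144
print(pixels_512 / pixels_256)         # 4.0
# Batch size 2 at 512x512 therefore needs roughly the activation memory of
# batch size 8 at 256x256, which plausibly exceeds an ~11-12 GB card.
```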

I actually recommend training with batch size 1, which seems to produce the best result given the same number of epochs.

edwardcho commented 2 years ago

Hello Sir, yes, I get an out-of-memory error on the GPU when I use batch size 2.

Thanks, Edward Cho.

yichuan-huang commented 3 months ago

@taesungp Hello, sir! I hope this comment finds you. You mentioned, "I actually recommend training with batch size 1, which seems to produce the best result given the same number of epochs." However, my tutor and I are concerned that a batch size of 1 may cause serious overfitting or failure to converge, and that the gradient updates could become very noisy. Could you explain clearly and in detail why you recommend a batch size of 1?
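For context, one architectural detail that may be relevant (this is not the author's stated reasoning): the generators and discriminators in this repo default to instance normalization (the `--normG`/`--normD` options, if I recall correctly), which normalizes each sample independently. So a batch size of 1 does not corrupt normalization statistics the way it can with batch normalization; the main effect of a small batch is noisier gradients. A minimal sketch of that difference, illustrative only and not code from this repository:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Affine parameters disabled so we compare only the normalization itself.
inorm = nn.InstanceNorm2d(3, affine=False)
bnorm = nn.BatchNorm2d(3, affine=False)
bnorm.train()  # use batch statistics, as during training

x = torch.randn(4, 3, 8, 8)   # a batch of 4 samples
single = x[:1]                # the first sample on its own

# Instance norm: per-sample statistics, so sample 0 is normalized the same
# way whether it is alone or inside a larger batch.
print(torch.allclose(inorm(x)[:1], inorm(single), atol=1e-6))   # True

# Batch norm: statistics are shared across the batch, so the result for
# sample 0 depends on which other samples happen to be in the batch.
print(torch.allclose(bnorm(x)[:1], bnorm(single), atol=1e-6))   # False (in general)
```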