edwardcho opened this issue 2 years ago
Do you get an out-of-memory error? You are training at higher resolution (4 times more pixels than a 256x256 image), so you can't fit 2 or more samples in the batch.
I actually recommend training with batch size 1, which seems to produce the best result given the same number of epochs.
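For anyone hitting the same wall, here is a minimal, repository-independent sketch of the memory argument: a 512x512 image has four times the pixels of a 256x256 image, so per-sample activation memory grows roughly fourfold, and a batch of 2 at 512x512 costs about as much as a batch of 8 at 256x256. The small conv stack below (`dummy_model`, `fits`) is purely illustrative and not part of this repo, so the absolute numbers will not match real training, but the probing pattern is the same.

```python
import torch
import torch.nn as nn

def dummy_model():
    # A small conv stack used only as a stand-in for the real generator/discriminator.
    return nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 3, 3, padding=1),
    )

def fits(batch_size, size, device="cuda"):
    """Return True if one forward/backward pass fits in GPU memory."""
    model = dummy_model().to(device)
    try:
        x = torch.randn(batch_size, 3, size, size, device=device)
        model(x).mean().backward()
        return True
    except RuntimeError as e:
        # Older PyTorch versions raise a plain RuntimeError for CUDA OOM.
        if "out of memory" in str(e).lower():
            return False
        raise
    finally:
        del model
        torch.cuda.empty_cache()

if __name__ == "__main__":
    # 512*512 / (256*256) == 4: each 512x512 sample needs ~4x the activations.
    print("pixel ratio 512^2 / 256^2 =", (512 * 512) // (256 * 256))
    if torch.cuda.is_available():
        for size in (256, 512):
            for bs in (1, 2, 4):
                print(f"crop {size}, batch {bs}: fits = {fits(bs, size)}")
```

Running this on the same card shows roughly where the OOM boundary sits; with the real networks the boundary arrives much sooner than with this toy model.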
Hello Sir, yes, I got an out-of-memory error on the GPU with batch size 2.
Thanks, Edward Cho.
@taesungp Hello, sir! I hope this comment reaches you. You mentioned, "I actually recommend training with batch size 1, which seems to produce the best result given the same number of epochs." However, my tutor and I concluded that setting the batch size to 1 may cause serious overfitting or a failure to converge, and that the gradient updates could become very noisy. Could you explain clearly and in detail why you recommend a batch size of 1?
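Not speaking for the maintainer, but if gradient noise at batch size 1 is the concern, one common middle ground is gradient accumulation: keep the per-step memory of batch size 1 and average gradients over several samples before each optimizer step. Below is a generic PyTorch sketch (the toy model, placeholder loss, and `accum_steps` value are illustrative, not from this repo); whether it actually beats plain batch size 1 for this GAN is something you would have to test.

```python
import torch
import torch.nn as nn

# Toy stand-ins; the real training loop has a generator, discriminator, and GAN losses.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1)
)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)
accum_steps = 4  # effective batch size = 1 * accum_steps, memory stays at batch size 1

optimizer.zero_grad()
for step in range(8):
    x = torch.randn(1, 3, 64, 64)      # one small sample at a time (toy size)
    loss = model(x).mean() ** 2        # placeholder loss
    (loss / accum_steps).backward()    # scale so accumulated gradients average out
    if (step + 1) % accum_steps == 0:
        optimizer.step()               # one update per accum_steps samples
        optimizer.zero_grad()
```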
Hello Sir,
When trying to train on my dataset (512x512, 3-channel PNG images), I ran into a batch-size error. My GPU is a GTX 1080 Ti (12 GB). I could only train with batch size 1; if I set it to 2 or more, training failed. Is this normal?
Thanks, Edward Cho.