Kid-Liet / Reg-GAN


Any reason not to use a larger batch size? #6

Closed by ydzhang12345 2 years ago

ydzhang12345 commented 2 years ago

Hi,

I noticed in the paper and also in the code that the batch size was set to 1 in the experiments. Is there any reason not to use a larger batch size? Thank you!

Kid-Liet commented 2 years ago

> Hi,
>
> I noticed in the paper and also in the code that the batch size was set to 1 in the experiments. Is there any reason not to use a larger batch size? Thank you!

Because instance normalization is adopted. Instance normalization is well suited to generative models: the result of image generation depends mainly on a single image instance, so normalizing over the whole batch is not appropriate for image stylization. Using instance normalization in style transfer not only accelerates model convergence but also keeps each image instance independent of the others.
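
For context, here is a minimal sketch of a generator block that uses instance normalization in the CycleGAN/pix2pix style (the layer layout is illustrative, not copied from the Reg-GAN repository): `nn.InstanceNorm2d` computes mean and variance per image and per channel, so each sample in the batch is normalized independently.

```python
import torch
import torch.nn as nn

# Illustrative residual block with instance normalization; the structure is a
# common CycleGAN-style choice, not the repository's exact code.
class ResBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),  # statistics are per image, per channel
            nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.block(x)

x = torch.randn(4, 64, 32, 32)  # a batch of 4 feature maps
y = ResBlock(64)(x)             # each sample is normalized independently
print(y.shape)                  # torch.Size([4, 64, 32, 32])
```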

ydzhang12345 commented 2 years ago

Yes, but since you are using instance norm you should still be able to use a larger batch size. If your code were using batch norm, then setting the batch size to 1 would behave similarly to instance norm.
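
That equivalence is easy to check numerically. The sketch below (illustrative only, not from the repository) shows that a training-mode `BatchNorm2d` with a batch of one produces the same output as `InstanceNorm2d`, while the two diverge once the batch is larger:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Pure normalization: no affine parameters, no running statistics.
bn = nn.BatchNorm2d(8, affine=False, track_running_stats=False)
inorm = nn.InstanceNorm2d(8, affine=False, track_running_stats=False)
bn.train()  # training mode: BatchNorm normalizes with the current batch's statistics

x1 = torch.randn(1, 8, 16, 16)                       # batch size 1
print(torch.allclose(bn(x1), inorm(x1), atol=1e-6))  # True: identical outputs

x8 = torch.randn(8, 8, 16, 16)                       # batch size 8
print(torch.allclose(bn(x8), inorm(x8), atol=1e-6))  # False: BatchNorm mixes samples
```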

Kid-Liet commented 2 years ago

Oh, I misunderstood you. I have tried batch sizes of 8 and 16 with the pix2pix model before, and the performance did decline, accompanied by model collapse. I'm not sure what caused it; my guess is either the inconsistent batch size between training and testing, or noise. I haven't tried other batch sizes with RegGAN, so you can give it a try.
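
One concrete way a train/test batch-size mismatch can show up is through batch normalization's running statistics, which are accumulated from training batches and then applied to single images at test time; instance normalization avoids this. A purely illustrative sketch (none of this is taken from the pix2pix or Reg-GAN code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A lone BatchNorm layer stands in for a network trained with batch statistics.
bn = nn.BatchNorm2d(4)  # track_running_stats=True by default
bn.train()
for _ in range(100):
    bn(torch.randn(16, 4, 8, 8) * 3 + 1)  # "training" batches of 16

x = torch.randn(1, 4, 8, 8) * 3 + 1       # a single image at test time

bn.eval()
out_eval = bn(x)    # normalized with the accumulated running statistics

bn.train()
out_train = bn(x)   # normalized with this image's own statistics (batch of 1)

# The gap between the two is one way a train/test batch-size mismatch
# manifests when batch normalization is involved.
print((out_eval - out_train).abs().max())
```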