junyanz / pytorch-CycleGAN-and-pix2pix

Image-to-Image Translation in PyTorch

Normalization layer #1005

Open zwep opened 4 years ago

zwep commented 4 years ago

Hi,

is there a reason why you re-use the normalization layer across the model? I thought it was better to initialize a new norm layer each time. I see that you can create a batch-norm layer without any tracking of running statistics, and in that case everything is fine. But there is an option to track statistics — in that case, doesn't the model go awry?

junyanz commented 4 years ago

We use different normalization layers across the model. The get_norm_layer function returns a layer constructor, rather than a layer instance, so each call creates a fresh layer. You can modify this function to track statistics.
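To illustrate the point, here is a minimal sketch of the pattern (the repository's own `get_norm_layer` lives in `models/networks.py`; the exact option names below are illustrative): the function returns a constructor via `functools.partial`, so no state is shared between layers built from it.

```python
import functools
import torch.nn as nn

def get_norm_layer(norm_type='instance'):
    """Return a normalization-layer *constructor*, not an instance."""
    if norm_type == 'batch':
        # batch norm: learnable affine params, running statistics tracked
        return functools.partial(nn.BatchNorm2d, affine=True,
                                 track_running_stats=True)
    elif norm_type == 'instance':
        # instance norm: no affine params, no running statistics
        return functools.partial(nn.InstanceNorm2d, affine=False,
                                 track_running_stats=False)
    else:
        raise NotImplementedError(f'norm layer [{norm_type}] not found')

norm_layer = get_norm_layer('instance')
a = norm_layer(64)   # each call builds a brand-new layer,
b = norm_layer(64)   # so a and b share no parameters or statistics
print(a is b)        # False
```

Because only the constructor is passed around, "re-using" `norm_layer` throughout the network still yields independent layers.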

zwep commented 4 years ago

Ah yeah I see, thanks.

I asked this because when I used this model in torch's .eval() mode, I got really strange results. I've read that this can happen if you use the same layer instance across the model, but that is not the case here. So hopefully, if I turn off the tracking of statistics, it will behave normally.

junyanz commented 4 years ago

Yes. The results look better without eval() mode.
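A small sketch of why eval() changes the output (input shape and values are arbitrary, chosen only for illustration): a BatchNorm2d layer that tracks running statistics switches from batch statistics to its running mean/var in eval mode, while an InstanceNorm2d layer without tracked statistics behaves identically in both modes.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# input deliberately far from the freshly initialized running stats (mean 0, var 1)
x = torch.randn(8, 3, 16, 16) * 5 + 2

bn = nn.BatchNorm2d(3, track_running_stats=True)
out_train = bn(x)        # train mode: normalizes with this batch's statistics
bn.eval()
out_eval = bn(x)         # eval mode: normalizes with the lagging running stats
print((out_train - out_eval).abs().max())   # noticeably non-zero

inorm = nn.InstanceNorm2d(3)  # track_running_stats=False by default
out_i_train = inorm(x)
inorm.eval()
out_i_eval = inorm(x)
print(torch.allclose(out_i_train, out_i_eval))  # True: eval() changes nothing
```

This matches the observation above: with running statistics tracked, eval() applies statistics that may not fit the current input, which can degrade the generated images.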