maum-ai / faceshifter

Unofficial PyTorch Implementation for FaceShifter (https://arxiv.org/abs/1912.13457)
BSD 3-Clause "New" or "Revised" License

Should affine be set as False in your ADD layers? #18

Closed rshaojimmy closed 3 years ago

rshaojimmy commented 3 years ago

Hi, thanks for your great implementation!

In the original paper, batch normalization is conducted without affine parameters (they are replaced with attributes and identities modulation parameters).

So the affine flag of the BN in the ADD layers should be explicitly set to False, like this:

self.BNorm = nn.BatchNorm2d(h_inchannel, affine=False)

Thanks for your response.
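To make the point concrete, here is a minimal sketch of an AAD/ADD-style layer following the paper's description: the normalization carries no affine parameters (`affine=False`), and the scale/shift are instead produced by the attribute and identity branches. The class and parameter names (`ADDLayer`, `z_att`, `z_id`, channel sizes) are illustrative assumptions, not the repo's actual API.

```python
import torch
import torch.nn as nn

class ADDLayer(nn.Module):
    """Sketch of an AAD-style layer (names are illustrative, not the repo's API)."""
    def __init__(self, h_inchannel, z_channel, z_id_size):
        super().__init__()
        # affine=False: drop the learnable gamma/beta, since the paper
        # replaces them with attribute/identity modulation below.
        self.BNorm = nn.BatchNorm2d(h_inchannel, affine=False)
        # attribute branch: per-pixel scale and shift from the attribute map
        self.conv_gamma = nn.Conv2d(z_channel, h_inchannel, 3, padding=1)
        self.conv_beta = nn.Conv2d(z_channel, h_inchannel, 3, padding=1)
        # identity branch: per-channel scale and shift from the identity vector
        self.fc_gamma = nn.Linear(z_id_size, h_inchannel)
        self.fc_beta = nn.Linear(z_id_size, h_inchannel)
        # learned mask to blend the two modulations
        self.conv_mask = nn.Conv2d(h_inchannel, 1, 3, padding=1)

    def forward(self, h, z_att, z_id):
        h_norm = self.BNorm(h)
        # attribute modulation
        A = self.conv_gamma(z_att) * h_norm + self.conv_beta(z_att)
        # identity modulation, broadcast over spatial dims
        g = self.fc_gamma(z_id).unsqueeze(-1).unsqueeze(-1)
        b = self.fc_beta(z_id).unsqueeze(-1).unsqueeze(-1)
        I = g * h_norm + b
        M = torch.sigmoid(self.conv_mask(h_norm))
        return (1 - M) * A + M * I

layer = ADDLayer(h_inchannel=64, z_channel=32, z_id_size=256)
h = torch.randn(2, 64, 16, 16)
z_att = torch.randn(2, 32, 16, 16)
z_id = torch.randn(2, 256)
out = layer(h, z_att, z_id)
```

With `affine=False`, `layer.BNorm.weight` and `layer.BNorm.bias` are `None`, so no per-channel gamma/beta compete with the modulation parameters.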

usingcolor commented 3 years ago

I think you are right. I have only tried it with my code as written, so I don't know how this change affects training.