odegeasslbc / FastGAN-pytorch

Official implementation of the ICLR 2021 paper "Towards Faster and Stabilized GAN Training for High-fidelity Few-shot Image Synthesis"
GNU General Public License v3.0

missing tanh? #9

Closed: tmramalho closed this issue 3 years ago

tmramalho commented 3 years ago

In models.py, lines 168-178:

        if self.im_size == 256:
            return [self.to_big(feat_256), self.to_128(feat_128)]

        feat_512 = self.se_512( feat_32, self.feat_512(feat_256) )
        if self.im_size == 512:
            return [self.to_big(feat_512), self.to_128(feat_128)]

        feat_1024 = self.feat_1024(feat_512)

        im_128 = torch.tanh(self.to_128(feat_128))
        im_1024 = torch.tanh(self.to_big(feat_1024))

It looks like when returning an image of size 1024, the outputs of the to_128 and to_big modules are wrapped in a tanh, while for sizes 256 and 512 they are not. I guess they were all supposed to be wrapped in tanh?
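For reference, a minimal sketch of what the 256 and 512 branches would look like with tanh applied consistently (just an illustration reusing the module names from the snippet above, not code from the repo):

        # Same return paths as above, but with tanh applied to every to-RGB output
        if self.im_size == 256:
            return [torch.tanh(self.to_big(feat_256)), torch.tanh(self.to_128(feat_128))]

        feat_512 = self.se_512(feat_32, self.feat_512(feat_256))
        if self.im_size == 512:
            return [torch.tanh(self.to_big(feat_512)), torch.tanh(self.to_128(feat_128))]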

odegeasslbc commented 3 years ago

Yes, that was one of the experiments where I wanted to validate whether tanh is necessary. It turned out that tanh does not seem to play a key role here, so I forgot to add it back. Please see also the official StyleGAN2 implementation, which likewise does not apply tanh to the output image. I cannot conclude whether tanh is useful or not; I suggest trying it on your own dataset to verify, and I would also like to see the outcome.
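For anyone who wants to compare the two variants on their own dataset, here is a small hypothetical wrapper (not part of this repo) that makes the output tanh switchable without editing the generator's forward pass:

    import torch
    import torch.nn as nn

    class OutputHead(nn.Module):
        # Hypothetical helper: wraps an existing to-RGB module and optionally
        # squashes its output to [-1, 1] with tanh, so both variants can be
        # trained on the same data and compared.
        def __init__(self, to_rgb: nn.Module, use_tanh: bool = True):
            super().__init__()
            self.to_rgb = to_rgb
            self.use_tanh = use_tanh

        def forward(self, feat):
            img = self.to_rgb(feat)
            return torch.tanh(img) if self.use_tanh else img

    # e.g. netG.to_big = OutputHead(netG.to_big, use_tanh=True)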