Closed harrygcoppock closed 4 years ago
Hello,
Are you talking about Progressive Growing of GANs? In that case the generator does not have an activation function in its final layer (see https://github.com/facebookresearch/pytorch_GAN_zoo/blob/master/models/trainer/standard_configurations/pgan_config.py#L60 and https://github.com/facebookresearch/pytorch_GAN_zoo/blob/master/models/loss_criterions/base_loss_criterions.py#L62).
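To make the distinction concrete, here is a minimal sketch (illustrative only, not the repo's actual code) of what a "linear" (i.e. identity, no activation) output layer means compared to a tanh output layer for a generator:

```python
import math

def generator_output(pre_activations, activation="linear"):
    """Apply the generator's final activation to its pre-activation values.

    "linear" means identity (no activation), as in pytorch_GAN_zoo's PGAN:
    outputs stay unbounded. "tanh" squashes outputs into (-1, 1).
    """
    if activation == "linear":
        return list(pre_activations)  # pass through unchanged
    if activation == "tanh":
        return [math.tanh(x) for x in pre_activations]
    raise ValueError(f"unknown activation: {activation}")

logits = [-3.0, 0.0, 2.5]
print(generator_output(logits, "linear"))  # values unchanged, unbounded
print(generator_output(logits, "tanh"))    # values squashed into (-1, 1)
```

With a linear output, the image range is instead controlled by the loss and the data normalization rather than by an activation function.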
Hi,
Thank you for your speedy reply.
Yes, for PGAN, sorry for not specifying. But from what I read in the PGAN paper, the generator does have an activation function; it is just the discriminator that doesn't have one. Or have I misunderstood something?
Thank you for your time.
My bad, they do use a linear activation in the generator for the CelebA-HQ dataset.
Hi,
I was wondering why the activation function of the output layer of the generator is set to linear. From what I have seen in the paper, only the discriminator has to have a linear final activation?
Thanks in advance