facebookresearch / pytorch_GAN_zoo

A mix of GAN implementations including progressive growing
BSD 3-Clause "New" or "Revised" License

Why is the final activation of the generator linear for PGAN? #119

Closed harrygcoppock closed 4 years ago

harrygcoppock commented 4 years ago

Hi,

I was wondering why the activation function of the generator's output layer is set to linear. From what I have seen in the paper, only the discriminator has to have a linear final activation?

Thanks in advance

Molugan commented 4 years ago

Hello,

Are you talking about Progressive Growing of GANs? In that case, the generator does not have an activation function in its final layer (see https://github.com/facebookresearch/pytorch_GAN_zoo/blob/master/models/trainer/standard_configurations/pgan_config.py#L60 and https://github.com/facebookresearch/pytorch_GAN_zoo/blob/master/models/loss_criterions/base_loss_criterions.py#L62).
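
For anyone landing here later, here is a minimal sketch of what a configurable final activation looks like. The class and parameter names below are made up for illustration and are not the repo's actual classes; the point is that a "linear" final activation just means the identity, i.e. the raw convolution output is returned unchanged when the activation is set to `None`:

```python
import torch
import torch.nn as nn

# Hypothetical generator head (illustration only, not the repo's code):
# the final activation is configurable, and None means a linear output.
class GeneratorHead(nn.Module):
    def __init__(self, in_channels=16, out_channels=3, final_activation=None):
        super().__init__()
        self.to_rgb = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.final_activation = final_activation  # None -> identity / linear

    def forward(self, x):
        x = self.to_rgb(x)
        if self.final_activation is not None:
            x = self.final_activation(x)
        return x

z = torch.randn(1, 16, 4, 4)
linear_head = GeneratorHead(final_activation=None)      # output not constrained to [-1, 1]
tanh_head = GeneratorHead(final_activation=torch.tanh)  # output guaranteed in [-1, 1]
print(linear_head(z).shape, tanh_head(z).shape)
```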

harrygcoppock commented 4 years ago

Hi,

Thank you for your speedy reply.

Yes, PGAN, sorry for not specifying. But from what I read in the PGAN paper, the generator does have an activation function; it is just the discriminator that doesn't. Or have I misunderstood something?

Thank you for your time.

harrygcoppock commented 4 years ago

My bad, they do use a linear activation in the generator for the CelebA-HQ dataset.
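
As a general note (not something this repo necessarily does internally): with a linear final activation the generated pixel values are unbounded, so they are typically clamped or rescaled before being visualized or saved. A small sketch of that post-processing step:

```python
import torch

# Stand-in for a generator output with a linear (unbounded) final layer.
fake = torch.randn(1, 3, 4, 4) * 1.5

img = fake.clamp(-1.0, 1.0)   # force values into the usual display range
img = (img + 1.0) / 2.0       # map [-1, 1] -> [0, 1] for saving as an image
print(img.min().item() >= 0.0, img.max().item() <= 1.0)  # True True
```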