heykeetae / Self-Attention-GAN

PyTorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)

About negative gamma #47

Closed SeokjuLee closed 4 years ago

SeokjuLee commented 4 years ago

Hi, thanks for your great efforts on this project. I have a question about the "gamma" parameter. Is it natural for gamma to be trained to a negative value? Does this result mean that attention has a negative effect?

harrygcoppock commented 4 years ago

My model often learns a negative gamma. I do not think this means that attention has a negative effect; what matters most is the magnitude, since out = input + gamma*attentionOutput.
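For context, here is a minimal sketch of the kind of block being discussed, with gamma as a learnable scalar initialised to zero. This mirrors the structure of the attention module in this repo but is a simplified illustration, not the exact source:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """SAGAN-style self-attention with a learnable scalar gamma (sketch)."""

    def __init__(self, in_dim):
        super().__init__()
        self.query = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.key = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.value = nn.Conv2d(in_dim, in_dim, kernel_size=1)
        # gamma starts at zero, so the block is initially an identity mapping;
        # nothing constrains its sign, so it is free to become negative.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.size()
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # B x N x C'
        k = self.key(x).view(b, -1, h * w)                     # B x C' x N
        attn = F.softmax(torch.bmm(q, k), dim=-1)              # B x N x N
        v = self.value(x).view(b, -1, h * w)                   # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return x + self.gamma * out  # out = input + gamma * attentionOutput
```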

SeokjuLee commented 4 years ago

@harrygcoppock Thanks Harry.

csyhping commented 3 years ago

@harrygcoppock @SeokjuLee Hi, I've got a negative gamma too; may I ask what value you got?

harrygcoppock commented 3 years ago

Sorry, I never saved the gamma results and cannot remember. My guess is that it slowly increased to 1 or -1 and plateaued at that magnitude, but again, that is just a guess.

csyhping commented 3 years ago

@harrygcoppock thanks for your quick reply. I got a gamma of about -0.01; do you think this is normal?

harrygcoppock commented 3 years ago

The point of the gamma parameter is that at the start of training (when gamma = 0) the model can quickly learn the easier convolutional features, and then slowly introduce information from the attention layer. I recall the paper argues that this is a somewhat harder task. If your gamma value remains low, maybe the model is struggling to make use of the attention layer? However, this is just speculation. Maybe first investigate the relative magnitudes of the skip connection and the output of the attention layer before gamma is applied (if |output from attention| >> |skip connection|, then a gamma of 0.01 could still be significant); see the sketch below. Finally, did the gamma value plateau at this level?
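If you want to check this concretely, a forward hook is one easy way to log the relative magnitudes. This is a sketch under the assumption that the block computes out = x + gamma * attn_out (as in the sketch above); the hook name and setup are mine, not part of the repo:

```python
import torch

def log_attention_scale(module, inputs, output):
    # For a block computing out = x + gamma * attn_out, the scaled attention
    # branch can be recovered as (output - x).
    x = inputs[0]
    with torch.no_grad():
        branch = output - x
        skip_n = x.norm().item()
        branch_n = branch.norm().item()
        print(f"gamma = {module.gamma.item():+.4f}  "
              f"|skip| = {skip_n:.3e}  |gamma*attn| = {branch_n:.3e}  "
              f"ratio = {branch_n / skip_n:.3e}")

# Usage with the SelfAttention sketch above:
attn = SelfAttention(64)
attn.register_forward_hook(log_attention_scale)
_ = attn(torch.randn(2, 64, 16, 16))
```

Comparing |gamma*attn| directly against |skip| answers the relative-magnitude question: a tiny ratio means the attention branch contributes little regardless of gamma's sign.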

csyhping commented 3 years ago

@harrygcoppock, yes, the value plateaued at this level. I'm a rookie at self-attention, so I have no sense of whether -0.01 is a normal value.

harrygcoppock commented 3 years ago

I too am relatively new to the field. If you are interested in finding out why, all I can suggest is my point above, plus the possibility that self-attention is simply not helping you here - maybe the problem is not a good match, or you have initialised your attention layer poorly.

csyhping commented 3 years ago

@harrygcoppock thanks a lot for your kind help! I will do more checks. Besides, may I ask what a normal value of gamma is when self-attention works well? 0.1, or any value?

SeokjuLee commented 3 years ago

@csyhping Hi, I can't remember the exact value, but I think I got a similar result. As Harry commented, the magnitude of gamma gradually increased from zero.