heykeetae / Self-Attention-GAN

Pytorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)

Generator class: why use Self_Attn(128, 'relu')? Can I use other numbers? #15

Closed: c1a1o1 closed this issue 6 years ago

c1a1o1 commented 6 years ago

```python
self.attn1 = Self_Attn(128, 'relu')
self.attn2 = Self_Attn(64, 'relu')
```
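For context, a minimal sketch of what the first argument means, assuming (as in this repo's `Self_Attn` module) that it is `in_dim`, the channel count of the incoming feature map. Under that assumption the numbers 128 and 64 are not free choices: each must match the output channels of the generator layer that feeds the attention block. The shapes below are illustrative.

```python
import torch
import torch.nn as nn

class Self_Attn(nn.Module):
    """Self-attention layer sketch: in_dim must equal the input's channel count."""
    def __init__(self, in_dim, activation):
        super().__init__()
        self.activation = activation
        # Query/key are projected down to in_dim // 8 channels via 1x1 convs,
        # so in_dim should be at least 8 for the projection to be non-empty.
        self.query_conv = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.key_conv = nn.Conv2d(in_dim, in_dim // 8, kernel_size=1)
        self.value_conv = nn.Conv2d(in_dim, in_dim, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        B, C, W, H = x.size()
        # Flatten spatial dims to N = W*H and compute an N x N attention map.
        q = self.query_conv(x).view(B, -1, W * H).permute(0, 2, 1)  # B x N x C'
        k = self.key_conv(x).view(B, -1, W * H)                     # B x C' x N
        attn = self.softmax(torch.bmm(q, k))                        # B x N x N
        v = self.value_conv(x).view(B, -1, W * H)                   # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(B, C, W, H)
        return self.gamma * out + x  # residual connection, shape preserved

# The argument must match the preceding layer's output channels:
feat = torch.randn(1, 128, 16, 16)   # e.g. output of a 128-channel deconv block
attn1 = Self_Attn(128, 'relu')       # 128 matches the feature map's channels
out = attn1(feat)                    # shape preserved: (1, 128, 16, 16)
```

So other numbers should work, provided the value matches the channel dimension of the feature map fed into the layer; 128 and 64 simply reflect where the two attention layers sit in this generator's channel schedule.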