heykeetae / Self-Attention-GAN

Pytorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)

About the attention map #19

Open o0t1ng0o opened 6 years ago

o0t1ng0o commented 6 years ago

Hi @heykeetae, I read the paper and saw the attention maps shown in it, but how can I visualize the attention map myself? In sagan_models.py there is a B X N X N tensor called "attention". How can I use this tensor? Could you please give me some advice? Many thanks!
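
One way to get hold of that tensor (a sketch, not an answer from the repo author): in sagan_models.py the Self_Attn module's forward already returns the attention tensor alongside its output, so you can capture the second return value. N is the number of spatial positions (width * height) of the feature map that the layer receives; the feature-map size below is an assumption.

```python
import torch
from sagan_models import Self_Attn  # the repo's self-attention layer

# Self_Attn.forward returns (out, attention); attention has shape (B, N, N),
# where N = width * height of the feature map fed into the layer.
attn_layer = Self_Attn(128, 'relu')   # in_dim and activation, as in this repo
feat = torch.randn(1, 128, 16, 16)    # dummy 16x16 feature map (size is an assumption)
out, attention = attn_layer(feat)

print(out.shape)        # torch.Size([1, 128, 16, 16])
print(attention.shape)  # torch.Size([1, 256, 256]) -> 256 = 16 * 16 spatial positions
```

The generator's forward in this repo also returns its attention maps, so during sampling you can keep them around instead of discarding them.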

kikyou123 commented 5 years ago

@o0t1ng0o I have the same question. Did you find a way to visualize the attention map?

santiestrada32 commented 5 years ago

I have the same question: given a point in the image, how do you visualize its attention map?
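
A sketch of one possible way (names like feat_h, feat_w and the chosen query position are assumptions): attention[b, i, j] is the softmax weight with which query position i attends to key position j, so for a chosen pixel of the feature map you take its row of the attention matrix, reshape it back to the feature-map grid, and upsample it to the image size as a heatmap.

```python
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

def show_query_attention(attention, feat_h, feat_w, query_row, query_col, image_size=64):
    # attention[b, i, j] = weight with which query position i looks at key position j
    q_idx = query_row * feat_w + query_col          # row-major flattening, as in the repo's .view()
    attn_row = attention[0, q_idx, :]               # (N,) weights over all key positions
    attn_map = attn_row.view(1, 1, feat_h, feat_w)  # back to the feature-map grid
    attn_map = F.interpolate(attn_map, size=(image_size, image_size),
                             mode='bilinear', align_corners=False)
    plt.imshow(attn_map[0, 0].detach().cpu().numpy(), cmap='jet')
    plt.title(f'attention of query ({query_row}, {query_col})')
    plt.colorbar()
    plt.show()

# Example usage with the attention tensor from the sketch above:
# show_query_attention(attention, feat_h=16, feat_w=16, query_row=8, query_col=8)
```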

xingdi1990 commented 5 years ago

Same issue here

xiaosean commented 5 years ago

Same question!!

jsczzzk commented 5 years ago

Same question!

kehuantiantang commented 5 years ago

Actually, I tried to visualize the final result after y = self.gamma * attention_x + x, but got nothing except noise.
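
One likely explanation (an assumption, not from the repo author): y is a multi-channel feature map rather than an RGB image, and self.gamma is initialised to zero, so early in training the attention branch adds almost nothing to x; rendering y directly is therefore expected to look like noise. Plotting the attention matrix itself is usually more informative, for example its average over all query positions (reusing the assumed names attention, feat_h, feat_w from the earlier sketch):

```python
import matplotlib.pyplot as plt

# Average attention received by each key position, over all query positions.
mean_attn = attention[0].mean(dim=0)  # (N,)
plt.imshow(mean_attn.view(feat_h, feat_w).detach().cpu().numpy(), cmap='jet')
plt.colorbar()
plt.title('mean attention over all queries')
plt.show()
```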

sybil12 commented 5 years ago

@o0t1ng0o (regarding the original question above about visualizing the B X N X N "attention" tensor) did you find a way to do this?

AdilZouitine commented 4 years ago

Same issue

Haoru commented 4 years ago

Same question!

csyhping commented 3 years ago

Same issue! Can @heykeetae or anyone else give some help?

valillon commented 3 years ago

Related #54