brain-research / self-attention-gan


Attention map visualization #11

Closed: santiestrada32 closed this issue 5 years ago

santiestrada32 commented 6 years ago

Hi,

I am having trouble understanding the right way to visualize the attention maps. Let's say the attention block is in the last layer and the image has w=128, h=128; that means the attention map has dimensions N x N, where N = w*h.

If I want to visualize the attention map for the midpoint, for example, which part of the attention map should I access?

The only idea I have is to take either row n or column n:

attention_map[:,n] or attention_map[n,:]

Could you explain how to correctly access the attention map for a specific point?

Thanks in advance
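
For concreteness, here is a minimal sketch of the indexing I have in mind. It assumes the attention matrix for a single image has shape (N, N) with N = w*h, that the spatial positions were flattened in row-major order, and that the softmax was taken over the last axis so each row sums to 1 and row n is the distribution of attention that query position n places over all positions. The helper name attention_for_point is mine, not from the repo:

```python
import numpy as np
import matplotlib.pyplot as plt

def attention_for_point(attention_map, x, y, h, w):
    """Return the (h, w) attention image for the query pixel at (x, y).

    Assumes attention_map has shape (N, N) with N = h * w, row-major
    flattening, and softmax over the last axis (rows sum to 1). If your
    implementation normalizes over the other axis, use
    attention_map[:, n] instead of attention_map[n, :].
    """
    n = y * w + x                   # flattened index of the query point
    attn_row = attention_map[n, :]  # attention weights for that query
    return attn_row.reshape(h, w)   # back to spatial layout

# Random data standing in for a real attention map, just to show the shapes.
h, w = 128, 128
N = h * w
attention_map = np.random.rand(N, N)
attention_map /= attention_map.sum(axis=1, keepdims=True)  # fake row softmax

# Attention seen from the midpoint of the image.
attn_img = attention_for_point(attention_map, x=w // 2, y=h // 2, h=h, w=w)

plt.imshow(attn_img, cmap='viridis')
plt.colorbar()
plt.title('Attention from the midpoint query')
plt.show()
```

Whether row n or column n is the query axis depends on the argument order of the matmul that builds the attention matrix and on which axis the softmax is applied to, so it would help to know which convention this repo uses.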

iamsiddhantsahu commented 5 years ago

@santiestrada32 I have the same question.

yifanjiang19 commented 5 years ago

Hi, @santiestrada32. Did you figure out how to visualize the attention map? Thanks.

betterze commented 5 years ago

same question

matak07 commented 4 years ago

Why was this issue closed without an answer? Did anyone find a solution? If so, could you please share it?