stqc / AttentionGAN_JupyterNB

A simplified version of the AttentionGAN

About the background attention mask #2

Open xichen-Hu opened 2 years ago

xichen-Hu commented 2 years ago

Sorry, but I notice that in the paper, the attention generation module is supposed to generate n-1 foreground attention masks and one background attention mask, and the background attention mask should multiply the input image. However, in your code, you generate 10 foreground masks, and they are all multiplied with the content masks. So I am wondering: is that also the case in the original code? Why is that?
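
For context, the fusion step as described in the paper can be sketched roughly like this (a minimal PyTorch sketch; the names `x`, `content`, and `attention` are illustrative, not taken from this repo):

```python
import torch

def fuse_paper_style(x, content, attention):
    """Paper's formulation (illustrative sketch): the n-1 foreground
    attention masks weight the n-1 content masks, and the n-th
    (background) attention mask weights the input image x.

    x:         (B, 3, H, W) input image
    content:   list of n-1 content masks, each (B, 3, H, W)
    attention: (B, n, H, W) attention masks, softmaxed over dim 1
    """
    out = x * attention[:, -1:, :, :]  # background mask * input image
    for i, c in enumerate(content):    # foreground masks * content masks
        out = out + c * attention[:, i:i + 1, :, :]
    return out
```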

stqc commented 2 years ago

> Sorry, but I notice that in the paper, the attention generation module is supposed to generate n-1 foreground attention masks and one background attention mask, and the background attention mask should multiply the input image. However, in your code, you generate 10 foreground masks, and they are all multiplied with the content masks. So I am wondering: is that also the case in the original code? Why is that?

If I remember correctly, the attention masks need to be changed based on the output you are expecting; I had a conversation with the author about this too. I will link you to my other findings about it in a few.
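
For comparison, the variant described in the question, where every attention mask weights a content mask and no mask is reserved for the input image, would look roughly like this (again a hedged sketch, not the repo's actual code):

```python
import torch

def fuse_all_foreground(content, attention):
    """Variant from the question above (illustrative sketch): all n
    attention masks multiply content masks; the input image is not
    blended back in via a dedicated background mask.

    content:   list of n content masks, each (B, 3, H, W)
    attention: (B, n, H, W) attention masks, softmaxed over dim 1
    """
    out = torch.zeros_like(content[0])
    for i, c in enumerate(content):
        out = out + c * attention[:, i:i + 1, :, :]
    return out
```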