The attention map is used here: https://github.com/VITA-Group/EnlightenGAN/blob/90903c3f9086583baf450474505d4dd1d5495ae0/models/networks.py#L736
Although this line of code is present, it is never executed, because `which_model_netG` selects 'unet_256' instead of 'sid_unet_resize'. Could you please help me understand why?
While running the code, I also noticed that A_Gray appears to be the attention map described in the paper, yet A_Gray is never passed to the network during training. Could you please clarify this as well?
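For context, my understanding of the paper's self-regularized attention is: take the illumination channel of the normalized input, and use 1 minus it as a per-pixel attention map, so darker regions receive stronger enhancement. A minimal NumPy sketch of that idea (the function names and the use of the max RGB channel as the illumination estimate are my assumptions, not the repository's actual API):

```python
import numpy as np

def attention_map(rgb):
    # rgb: float array in [0, 1], shape (H, W, 3).
    # Illumination estimated as the per-pixel max over RGB channels
    # (an assumption for illustration); attention = 1 - illumination,
    # so dark pixels get attention near 1 and bright pixels near 0.
    illum = rgb.max(axis=-1)
    return 1.0 - illum

def apply_attention(input_img, residual):
    # Weight the network's residual by the attention map before
    # adding it back to the input, so enhancement concentrates on
    # under-exposed regions.
    a = attention_map(input_img)[..., None]  # broadcast over channels
    return input_img + residual * a
```

If this matches the intended design, the map would need to be computed from the input and multiplied into the generator's output (or intermediate features) at training time, which is why it is surprising that A_Gray is not fed to the network.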