Topaz1618 / CycleganSA

💻 🐈 Added a self-attention layer to the CycleGAN implementation (PyTorch).

How is the attention part applied? #2

Open ats4869 opened 1 year ago

ats4869 commented 1 year ago

Can you tell me how the attention mechanism is applied? I have been looking at your source code for a long time and can't tell whether the attention mechanism is applied to the generator or to the discriminator. Thanks for sharing.
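For reference, here is a minimal sketch of the SAGAN-style self-attention block (Zhang et al., 2018) that self-attention CycleGAN variants commonly insert into their networks. This is a generic illustration, not code taken from this repo, and the class and variable names are hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """SAGAN-style self-attention over 2D feature maps."""
    def __init__(self, in_channels):
        super().__init__()
        # 1x1 convs project features to query/key/value spaces
        self.query = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.key   = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        # learned residual weight, initialized to 0 so training starts
        # from the plain (attention-free) network
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.size()
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # B x HW x C'
        k = self.key(x).view(b, -1, h * w)                     # B x C' x HW
        attn = F.softmax(torch.bmm(q, k), dim=-1)              # B x HW x HW
        v = self.value(x).view(b, -1, h * w)                   # B x C x HW
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x                            # residual connection
```

In most implementations a block like this is dropped between layers of the ResNet generator and/or the PatchGAN discriminator at an intermediate resolution, e.g. `SelfAttention(256)` after a residual block; whether this repo attaches it to the generator, the discriminator, or both is exactly the question here.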

ats4869 commented 1 year ago

I see that you have created several files in the model folder with names starting with "my", but their content is the same as the original files.
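One quick way to verify that claim is a byte-for-byte comparison of the paired files; the paths below are placeholders to substitute with the actual filenames from the repo:

```python
import filecmp

# shallow=False forces a content comparison, not just a stat() check
print(filecmp.cmp("models/networks.py", "models/my_networks.py", shallow=False))
```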