Closed: john09282922 closed this issue 6 months ago.
Hi @john09282922!
To use cross-attention instead of self-attention, replace 'attn1' with 'attn2' at the two locations below (in diffusers-style UNets, 'attn1' modules are the self-attention layers and 'attn2' modules are the cross-attention layers).
- Save the attention features: https://github.com/HyelinNAM/ContrastiveDenoisingScore/blob/285d1ccce307e1d8c0298bcdfb29de7776b5d691/pipeline_cds.py#L166-L169
- Get and load the saved features, then calculate the CUT loss: https://github.com/HyelinNAM/ContrastiveDenoisingScore/blob/285d1ccce307e1d8c0298bcdfb29de7776b5d691/pipeline_cds.py#L190-L195
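For reference, here is a minimal sketch of how such a name-based switch typically looks with PyTorch forward hooks. It assumes a diffusers-style UNet whose attention modules are named `attn1` (self-attention) and `attn2` (cross-attention); the `feature_store` dict, the `register_attention_hooks` helper, and the hook details are illustrative, not the exact code at the links above.

```python
import torch

def register_attention_hooks(unet, feature_store, target="attn2"):
    # Collect features from the chosen attention layers.
    # target="attn1" -> self-attention (the original behavior),
    # target="attn2" -> cross-attention (the change suggested above).
    def make_hook(name):
        def hook(module, inputs, output):
            # inputs[0] holds the hidden states entering the attention layer;
            # detach so the stored features don't keep the autograd graph alive.
            feature_store[name] = inputs[0].detach()
        return hook

    handles = []
    for name, module in unet.named_modules():
        if name.endswith(target):
            handles.append(module.register_forward_hook(make_hook(name)))
    return handles  # call handle.remove() on each to unregister later
```

After a denoising forward pass, `feature_store` would then hold one tensor per cross-attention layer, which can be fed into the CUT loss in place of the self-attention features.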
Dear Hyelin,
I would like to follow the cross-attention method shown in Figure 6 of your paper. How can I modify your code to use cross-attention?
Thanks, john