google / prompt-to-prompt

Apache License 2.0

Understanding AttentionReplace #45

Open cwchenwang opened 1 year ago

cwchenwang commented 1 year ago

In replace_cross_attention of AttentionReplace, why is attn_replace not used? I think that, according to the paper, we have to replace attn_base with the corresponding layers in attn_replace.

[screenshot of AttentionReplace.replace_cross_attention]
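Here is roughly what that method does, as I read it (a minimal sketch; the shapes are my assumption, and mapper maps the source prompt's token positions onto the edited prompt's tokens):

```python
import torch

def replace_cross_attention_sketch(attn_base, att_replace, mapper):
    # attn_base: (heads, pixels, tokens) cross-attention of the source prompt
    # mapper:    (batch, tokens, tokens) token mapping from source to edited prompt
    # Only the source prompt's attention is re-indexed onto the edited prompt's
    # tokens here; att_replace is not used inside this method at all.
    return torch.einsum('hpw,bwn->bhpn', attn_base, mapper)
```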

Thank you

MiaoQiaowei commented 8 months ago

Same question. And I find it's really hard to understand.

Time-Lord12th commented 8 months ago

> Same question. And I find it's really hard to understand.

It's preprocessing; attn_replace is used in AttentionControlEdit's forward.
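Roughly, the cross-attention branch of forward does the following (a simplified paraphrase, not the exact code; the reshaping and self-attention bookkeeping is left out and it is written as a standalone function):

```python
def blend_cross_attention(attn, replace_fn, cross_replace_alpha, cur_step):
    # attn[0] is the source prompt's attention map, attn[1:] the edited prompts'.
    # replace_fn stands in for AttentionReplace.replace_cross_attention
    # (already bound to its token mapper).
    attn_base, attn_replace = attn[0], attn[1:]
    # Per-step, per-token blending weights derived from cross_replace_steps.
    alpha_words = cross_replace_alpha[cur_step]
    # Where alpha_words == 1 the injected (source-derived) attention is used;
    # where alpha_words == 0 the edited prompt keeps its own attn_replace.
    attn[1:] = (replace_fn(attn_base, attn_replace) * alpha_words
                + (1 - alpha_words) * attn_replace)
    return attn
```

So replace_cross_attention only prepares the injected attention; the actual mixing with attn_replace happens in this blend.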

zhw123456789 commented 6 months ago

> Same question. And I find it's really hard to understand.
>
> It's preprocessing; attn_replace is used in AttentionControlEdit's forward.

But I find that in AttentionControlEdit's forward, (1 - alpha_words) is all zeros, which means attn_replace is still not used. Do you have any idea about this? Thanks a lot.

Time-Lord12th commented 6 months ago

> Same question. And I find it's really hard to understand.
>
> It's preprocessing; attn_replace is used in AttentionControlEdit's forward.
>
> But I find that in AttentionControlEdit's forward, (1 - alpha_words) is all zeros, which means attn_replace is still not used. Do you have any idea about this? Thanks a lot.

alpha_words is determined by cross_replace_steps; keep in mind that cross_replace_steps can be set to some x < 1, in which case alpha_words becomes 0 for the later steps and attn_replace does get used there.
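To make that concrete, a minimal sketch of how a scalar cross_replace_steps turns into the per-step alpha_words (my own simplification of the behaviour for a single float value, not the repo's full utility):

```python
import torch

def make_alpha_words(num_steps: int, cross_replace_steps: float, max_tokens: int = 77):
    # One weight per diffusion step, broadcast over heads / pixels / tokens:
    # 1 for the first cross_replace_steps fraction of steps, 0 afterwards.
    alpha = torch.zeros(num_steps, 1, 1, max_tokens)
    cutoff = int(num_steps * cross_replace_steps)
    alpha[:cutoff] = 1.0
    return alpha
```

With cross_replace_steps = 1.0, alpha_words is 1 at every step, so (1 - alpha_words) * attn_replace is indeed always zero, which is what you observed. With e.g. cross_replace_steps = 0.8 and 50 steps, the last 10 steps have alpha_words = 0 and the edited prompt keeps its own attn_replace there.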