Open · cwchenwang opened this issue 1 year ago

In the `replace_cross_attention` of `AttentionReplace`, why is `attn_replace` not used? I guess that, according to the paper, we have to replace `attn_base` with the corresponding layers in `attn_replace`. Thank you!
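For context, this is roughly the method being asked about, as I remember it from the public prompt-to-prompt notebook (the stub `__init__` is mine, just to make the sketch self-contained; in the repo the class subclasses `AttentionControlEdit`):

```python
import torch

class AttentionReplace:  # in the repo: class AttentionReplace(AttentionControlEdit)
    def __init__(self, mapper: torch.Tensor):
        # mapper: (batch, source_words, target_words) token-correspondence matrix
        self.mapper = mapper

    def replace_cross_attention(self, attn_base, att_replace):
        # Re-maps the *base* prompt's attention onto the target prompt's tokens.
        # Note: att_replace is indeed ignored here, which is the question.
        return torch.einsum('hpw,bwn->bhpn', attn_base, self.mapper)
```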
Same question, and I find it really hard to understand.
It's preprocessing; `attn_replace` is used in `AttentionControlEdit`'s `forward`.
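To make the reply above concrete, here is a minimal runnable sketch of the blending step in `forward` (paraphrased, not the verbatim repo code; the shapes and the identity `mapper` are illustrative). `attn_replace` enters through the `(1 - alpha_words)` term:

```python
import torch

heads, pixels, words = 8, 4096, 77
attn_base = torch.rand(heads, pixels, words)        # source prompt's cross-attention
attn_replace = torch.rand(1, heads, pixels, words)  # edited prompt's cross-attention
mapper = torch.eye(words).unsqueeze(0)              # identity token mapping, for simplicity

def replace_cross_attention(attn_base, att_replace):
    # AttentionReplace ignores att_replace: it re-maps the base attention
    return torch.einsum('hpw,bwn->bhpn', attn_base, mapper)

# In the repo alpha_words is a per-word tensor (cross_replace_alpha[cur_step]);
# a scalar is enough to show the blend.
alpha_words = torch.tensor(1.0)
attn_replace_new = (replace_cross_attention(attn_base, attn_replace) * alpha_words
                    + (1 - alpha_words) * attn_replace)  # attn_replace used here
print(attn_replace_new.shape)  # torch.Size([1, 8, 4096, 77])
```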
But I find that in `AttentionControlEdit`'s `forward`, `(1 - alpha_words)` equals all zeros, which means `attn_replace` is still not used. Do you have any idea about this? Thanks a lot.
`alpha_words` is related to `cross_replace_steps`, and note that `cross_replace_steps` can be x < 1. In that case `alpha_words` drops to 0 for the later diffusion steps, so `(1 - alpha_words)` becomes nonzero and `attn_replace` does get used.
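A tiny illustration of that point (made-up numbers, not the repo's schedule-building code): with `cross_replace_steps = 1.0` the weight `(1 - alpha_words)` is zero at every step, which reproduces the observation above; with `0.8` it becomes nonzero for the last 20% of the steps:

```python
import torch

num_steps = 50
cross_replace_steps = 0.8  # try 1.0 to reproduce the all-zeros case

# 1 while the step is inside the replacement window, 0 afterwards
alpha = torch.zeros(num_steps)
alpha[: int(num_steps * cross_replace_steps)] = 1.0

for step in (0, 39, 40, 49):
    alpha_words = alpha[step]
    print(f"step {step:2d}: alpha_words={alpha_words.item():.0f}, "
          f"1 - alpha_words={(1 - alpha_words).item():.0f}")
# steps 0-39: (1 - alpha_words) = 0 -> attn_replace not used
# steps 40-49: (1 - alpha_words) = 1 -> attn_replace used
```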