aosong01 opened this issue 1 year ago
I have the same query
The first for loop modifies the following blocks:
down_blocks.0.attentions.0.transformer_blocks.0
down_blocks.0.attentions.1.transformer_blocks.0
down_blocks.1.attentions.0.transformer_blocks.0
down_blocks.1.attentions.1.transformer_blocks.0
down_blocks.2.attentions.0.transformer_blocks.0
down_blocks.2.attentions.1.transformer_blocks.0
up_blocks.1.attentions.0.transformer_blocks.0
up_blocks.1.attentions.1.transformer_blocks.0
up_blocks.1.attentions.2.transformer_blocks.0
up_blocks.2.attentions.0.transformer_blocks.0
up_blocks.2.attentions.1.transformer_blocks.0
up_blocks.2.attentions.2.transformer_blocks.0
up_blocks.3.attentions.0.transformer_blocks.0
up_blocks.3.attentions.1.transformer_blocks.0
up_blocks.3.attentions.2.transformer_blocks.0
mid_block.attentions.0.transformer_blocks.0
The second for loop modifies:
up_blocks.1.attentions.1.transformer_blocks.0.attn1
up_blocks.1.attentions.2.transformer_blocks.0.attn1
up_blocks.2.attentions.0.transformer_blocks.0.attn1
up_blocks.2.attentions.1.transformer_blocks.0.attn1
up_blocks.2.attentions.2.transformer_blocks.0.attn1
up_blocks.3.attentions.0.transformer_blocks.0.attn1
up_blocks.3.attentions.1.transformer_blocks.0.attn1
up_blocks.3.attentions.2.transformer_blocks.0.attn1
These blocks are a subset of those modified by the first for loop.
According to the comment in the code, the first block of the lowest resolution shouldn't have extended attention registered, yet the first for loop registers extended attention for that block as well.
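For context, the two loops described above have roughly the following shape. This is a paraphrased sketch, not the exact repo code: `unet`, `sa_forward`, and `injection_schedule` stand for the diffusers UNet, the repo's extended-attention forward factory, and the list of injection timesteps, and `res_dict` simply mirrors the block lists above.

```python
# Paraphrased sketch of the two registration loops (not the exact repo code).
def register_extended_attention_sketch(unet, sa_forward, injection_schedule):
    # First loop: patch attn1 of every BasicTransformerBlock in the UNet
    # (down_blocks, mid_block, and up_blocks) with an empty schedule, so these
    # blocks get the extended-attention forward but never inject.
    for _, module in unet.named_modules():
        if module.__class__.__name__ == "BasicTransformerBlock":
            module.attn1.forward = sa_forward(module.attn1)
            module.attn1.injection_schedule = []

    # Second loop: re-patch only the up_blocks listed above (note that
    # up_blocks[1].attentions[0], the first block of the lowest resolution,
    # is skipped), this time attaching the real injection schedule.
    res_dict = {1: [1, 2], 2: [0, 1, 2], 3: [0, 1, 2]}
    for res, blocks in res_dict.items():
        for block in blocks:
            attn = unet.up_blocks[res].attentions[block].transformer_blocks[0].attn1
            attn.forward = sa_forward(attn)
            attn.injection_schedule = injection_schedule
```

If the code indeed works like this, the overlap between the two loops is expected: the first loop attaches only an empty schedule everywhere, and the second loop overrides the schedule on the decoder blocks where injection should actually happen.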
Same question here.
I think the function that actually takes effect is `register_extended_attention_pnp`, where a list `injection_schedule` is defined.
https://github.com/omerbt/TokenFlow/blob/8ae24e9d00069ffec24407a60f1d410f43035393/tokenflow_utils.py#L203-L214
The injection is activated according to `injection_schedule` (a small illustration of the gating follows the links below).
https://github.com/omerbt/TokenFlow/blob/8ae24e9d00069ffec24407a60f1d410f43035393/tokenflow_utils.py#L124-L130
https://github.com/omerbt/TokenFlow/blob/8ae24e9d00069ffec24407a60f1d410f43035393/tokenflow_utils.py#L86-L91
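To make the gating concrete, here is a small self-contained illustration. The function names and the 0.5 fraction are hypothetical, not the repo's API; it only shows how a timestep schedule can gate injection.

```python
import torch

# Hypothetical illustration of timestep-gated injection (names and the 0.5
# fraction are made up for this example, not taken from the repo).
def make_injection_schedule(timesteps, fraction):
    """Keep the first `fraction` of the (descending) timesteps for injection."""
    n = int(len(timesteps) * fraction)
    return {int(t) for t in timesteps[:n]}

def should_inject(t, injection_schedule):
    # Blocks registered with an empty schedule never inject; blocks registered
    # with a real schedule inject only at the scheduled timesteps.
    return bool(injection_schedule) and t in injection_schedule

timesteps = torch.linspace(999, 0, 50).long()      # stand-in for the scheduler's timesteps
schedule = make_injection_schedule(timesteps, 0.5)
print(should_inject(999, schedule))   # True: early denoising step, injection active
print(should_inject(0, schedule))     # False: late step, no injection
print(should_inject(999, set()))      # False: empty schedule never injects
```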
BTW, I tried removing the first loop (L203-L206) and found that the result was unchanged. However, when I removed the second loop (L208-L214), the result got worse.
It seems that you change every BasicTransformerBlock in the down_blocks, mid_block, and up_blocks. Why change the up_blocks of the UNet again?