Open · christopher5106 opened this issue 3 days ago
Cc: @yiyixuxu
Hey @christopher5106, yes! Do you want to open a PR?
Hi @yiyixuxu, I am working on this issue and it seems like `attention_mask` is not being used by all the pipelines. Could you help me find a case where an attention mask is actually built and passed to the attention processor?
Thanks, @rootonchair! The reason that is not the case is that the original Flux implementation doesn't really use any mask, so the Flux-related pipelines don't use one either. So, if we were to actually use the attention mask in the Flux attention processor, users would have to make sure to pass it accordingly in their implementations.
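For concreteness, here is a hedged sketch of what "passing the mask accordingly" could look like from the caller's side. None of this is upstream behavior today: it assumes `joint_attention_kwargs` are forwarded unchanged down to the attention processors, the sequence lengths are illustrative (512 T5 text tokens plus 4096 latent tokens for a 1024x1024 image), and the mask-aware processor referenced in the comment is the hypothetical one sketched under the issue description below.

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# Hypothetical: swap in a processor that honors the mask (see the sketch
# under "Describe the bug" below). With the stock FluxAttnProcessor2_0,
# the mask below is accepted but silently dropped, which is exactly the
# behavior this issue reports.
# pipe.transformer.set_attn_processor(MaskedFluxAttnProcessor2_0())

# Boolean mask over the joint text+image token sequence; SDPA expects a
# shape broadcastable to [batch, heads, q_len, kv_len]. Illustrative
# sizes: 512 T5 tokens + 4096 latent tokens for a 1024x1024 image.
seq_len = 512 + 4096
attention_mask = torch.ones(1, 1, seq_len, seq_len, dtype=torch.bool, device="cuda")

image = pipe(
    "a photo of a cat",
    # assumption: these kwargs reach the attention processors unchanged
    joint_attention_kwargs={"attention_mask": attention_mask},
).images[0]
```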
Describe the bug
Is it possible to get back the `attention_mask` argument in the Flux attention processor (https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py#L1910) in order to tweak things a bit? Otherwise the `attention_mask` argument is unused. Thanks a lot!
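A minimal sketch of the requested tweak, not the upstream implementation: a drop-in processor that forwards `attention_mask` to `F.scaled_dot_product_attention` instead of discarding it. The class name `MaskedFluxAttnProcessor2_0` is made up, and only the single-block path (`encoder_hidden_states is None`) is shown; the joint path would also need the `encoder_hidden_states` projections from the original `FluxAttnProcessor2_0`.

```python
import torch
import torch.nn.functional as F

from diffusers.models.embeddings import apply_rotary_emb


class MaskedFluxAttnProcessor2_0:  # hypothetical name
    """Variant of FluxAttnProcessor2_0 that actually applies `attention_mask`."""

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, image_rotary_emb=None):
        # Simplified: only the single-block path (no encoder_hidden_states).
        batch_size = hidden_states.shape[0]

        query = attn.to_q(hidden_states)
        key = attn.to_k(hidden_states)
        value = attn.to_v(hidden_states)

        # Split the projections into attention heads.
        head_dim = key.shape[-1] // attn.heads
        query = query.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
        key = key.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
        value = value.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)

        # QK normalization, as used by the Flux attention layers.
        if attn.norm_q is not None:
            query = attn.norm_q(query)
        if attn.norm_k is not None:
            key = attn.norm_k(key)

        if image_rotary_emb is not None:
            query = apply_rotary_emb(query, image_rotary_emb)
            key = apply_rotary_emb(key, image_rotary_emb)

        # The one-line change this issue asks for: pass the mask to SDPA
        # (a boolean mask or an additive float bias broadcastable to
        # [batch, heads, q_len, kv_len]) instead of dropping it.
        hidden_states = F.scaled_dot_product_attention(
            query, key, value, attn_mask=attention_mask,
            dropout_p=0.0, is_causal=False,
        )

        hidden_states = hidden_states.transpose(1, 2).reshape(
            batch_size, -1, attn.heads * head_dim
        )
        return hidden_states.to(query.dtype)
```

It could then be installed with `pipe.transformer.set_attn_processor(MaskedFluxAttnProcessor2_0())`, as in the caller-side sketch above.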
Reproduction
`pip install diffusers`
Logs
No response
System Info
Ubuntu
Who can help?
@yiyixuxu @sayakpaul @DN6 @asomoza