If you can modify the source code, this might be helpful. If not, downgrade `diffusers` to v0.16.0, where the `Attention` module uses `nn.Linear` rather than `LoRACompatibleLinear`, which is what causes the issue above.
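For example, in a notebook cell the downgrade plus a quick sanity check might look like the sketch below. The `diffusers.models.attention_processor` import path is an assumption based on recent releases and may differ in your version:

```python
# Pin diffusers to the version whose Attention module still uses nn.Linear.
# In a Kaggle/Jupyter notebook, run this first in its own cell:
#   !pip install diffusers==0.16.0

import diffusers

# Confirm the downgraded version is actually the one imported
# (restart the kernel after installing if this fails).
assert diffusers.__version__ == "0.16.0", diffusers.__version__

# Hedged verification that the projection layers are plain nn.Linear;
# the import path below is assumed and may vary across releases.
from diffusers.models.attention_processor import Attention

attn = Attention(query_dim=64)
print(type(attn.to_q))  # expected: <class 'torch.nn.modules.linear.Linear'>
```

If the `assert` fails, the old version is still cached in the running kernel; restarting the notebook kernel after the `pip install` usually resolves it.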
I'm trying to work in Kaggle notebooks. This is the function being called: