kabachuha / sd-webui-text2video

Auto1111 extension implementing text2video diffusion models (like ModelScope or VideoCrafter) using only Auto1111 webui dependencies

Fix #102 (b): RE: Fix #102 (a) #126

Closed rbfussell closed 1 year ago

rbfussell commented 1 year ago

reference [102a]: https://github.com/deforum-art/sd-webui-text2video/commit/67aaba9f0856589074384b5412c4553647f02d22 :: `-attn_mask` / `+attn_bias`

Fixes the second source of `memory_efficient_attention() got an unexpected keyword argument 'mask'` errors.

#102 was closed, but the issue persisted: line 492 still passed the keyword `attn_mask` instead of `attn_bias`. After changing it, the error no longer appears.
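The mismatch can be illustrated with a stand-in for xformers' `memory_efficient_attention` (a simplified placeholder signature, not the real implementation; the real op lives in `xformers.ops` and computes scaled dot-product attention):

```python
# Stand-in mimicking the keyword interface of xformers'
# memory_efficient_attention: it accepts attn_bias, not attn_mask.
def memory_efficient_attention(query, key, value, attn_bias=None, p=0.0):
    # Placeholder body; the real op computes memory-efficient attention.
    return query

q = k = v = [[1.0]]

# Before the fix (line 492): calling with attn_mask= raises the reported error.
try:
    memory_efficient_attention(q, k, v, attn_mask=None)
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'attn_mask'

# After the fix: attn_bias= matches the signature, so the call succeeds.
out = memory_efficient_attention(q, k, v, attn_bias=None)
```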

kabachuha commented 1 year ago

I thought it was attn_mask when I was reading torch2 docs (and it worked for me for some reason).

Anyway, thank you greatly for finding this mistake and fixing it!

rbfussell commented 1 year ago

> I thought it was attn_mask when I was reading torch2 docs (and it worked for me for some reason).
>
> Anyway, thank you greatly for finding this mistake and fixing it!

No problem, it had been bothering me for a while. When I saw your fix for #102, I hand-edited the .py to apply your diff, but it was still failing; then I noticed the traceback pointed to a new line number and fixed that occurrence too. It has worked great since.

Do you have --xformers activated? I think this may be the difference. I don't get this error when I leave --xformers off the webui command line, but then I also can't run the extension because it OOMs. From the function it looks like it is related to xformers, so I imagine that if you have enough VRAM and don't use it, this may just never be an issue.

kabachuha commented 1 year ago

I use Torch2, but I believe I tested it with xformers too, by commenting out torch2 and adding --xformers to the args.

rbfussell commented 1 year ago

> I use Torch2, but I believe I tested it with xformers too, by commenting out torch2 and adding --xformers to the args.

Ah, I wonder if there is some relation to available VRAM when using xformers; I'm near the minimum with a 3060 Ti FE and 8 GB. If it worked for you in that case, it may only be an issue with xformers plus low VRAM. I did have it work okay a couple of times before changing the script, though I'm not sure what settings I was using at the time; it was when I first began to play with the extension.