I get an error when executing the new attn mask workflow. I narrowed it down to a variable dtype compatibility issue:
from attention.py in \comfy\ldm\modules (line 345)
out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
I tried converting k and v with k.half() and v.half(), but then I can't keep the attn_mask (it only works with attn_mask=None), and of course the generated image has nothing to do with what I wanted.
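For reference, here is a minimal sketch of the kind of cast I would expect to work around it, assuming the error really is an fp16 q/k/v vs. fp32 mask mismatch (the function name and the cast are my own illustration, not the actual ComfyUI code):

    import torch
    import torch.nn.functional as F

    def sdpa_with_mask(q, k, v, mask=None):
        # If the mask is a float tensor whose dtype differs from q (e.g. fp32 mask
        # with fp16 q/k/v), cast it instead of dropping it with attn_mask=None.
        if mask is not None and mask.dtype != torch.bool and mask.dtype != q.dtype:
            mask = mask.to(dtype=q.dtype, device=q.device)
        return F.scaled_dot_product_attention(
            q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False
        )

That way the mask still contributes to the attention instead of being discarded, which is what seems to break the generated image.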