cubiq / ComfyUI_IPAdapter_plus

GNU General Public License v3.0

Variable types #508

Closed Olaizola13 closed 2 months ago

Olaizola13 commented 2 months ago

I get an error when executing the new attn mask workflow. I narrowed it down to a variable type (dtype) compatibility problem:

From `attention.py` in `\comfy\ldm\modules` (line 345):

```python
out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
```

I tried converting k and v with `k.half()` and `v.half()`, but then the attn_mask no longer works (it only runs with `None`), and of course the generated image has nothing to do with what I wanted.
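For context, a minimal sketch of the dtype issue (my assumption about what is going on): `scaled_dot_product_attention` requires a floating-point `attn_mask` to have the same dtype as `q`/`k`/`v`, so casting only `k` and `v` while leaving the mask (and `q`) in another dtype still mismatches. Casting the mask to `q.dtype` keeps it usable. Shown here in float32/float64 for portability; the same principle applies to fp16:

```python
import torch
import torch.nn.functional as F

# Toy attention tensors: (batch, heads, tokens, head_dim)
q = torch.randn(1, 4, 8, 16)                      # float32
k = torch.randn(1, 4, 8, 16)
v = torch.randn(1, 4, 8, 16)

# A mask produced in a different dtype (here float64, standing in
# for the fp32-mask-vs-fp16-tensors mismatch from the workflow)
mask = torch.zeros(1, 4, 8, 8, dtype=torch.float64)

# Fix: cast the mask to match q's dtype instead of dropping it (None)
out = F.scaled_dot_product_attention(
    q, k, v,
    attn_mask=mask.to(q.dtype),   # mask kept, dtypes now consistent
    dropout_p=0.0,
    is_causal=False,
)
print(out.shape, out.dtype)
```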

cubiq commented 2 months ago

Run with `--force-fp16`.

Olaizola13 commented 2 months ago

Thanks!