comfyanonymous / ComfyUI_examples

Examples of ComfyUI workflows
https://comfyanonymous.github.io/ComfyUI_examples/

1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.) out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False) #31

Open · ritmototal opened this issue 3 months ago

ritmototal commented 3 months ago

Hello,

From what I have been reading, this might be slowing down my rendering. A few other people have had this issue recently on fresh installs, but I can't seem to find a fix. I have tried uninstalling and reinstalling, but the error persists. Does anyone know how to resolve this?

C:\Program Files\ComfyUI\ComfyUI\comfy\ldm\modules\attention.py:407: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.) out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
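For what it's worth, this is a UserWarning rather than an error: it means the flash-attention kernel was not compiled into that particular PyTorch build (reportedly common with the Windows wheels), and `scaled_dot_product_attention` falls back to another backend, so ComfyUI keeps working, possibly somewhat slower. Here is a minimal sketch, assuming PyTorch 2.x with a CUDA GPU (the tensor shapes are arbitrary placeholders), to check which attention backends your build reports and to reproduce the call from the warning:

```python
import torch
import torch.nn.functional as F

# Report the PyTorch build and CUDA version in use.
print("torch:", torch.__version__, "cuda:", torch.version.cuda)

# Check whether each scaled_dot_product_attention backend is enabled.
# The warning above fires when the flash kernel is missing from the build,
# in which case PyTorch silently falls back to one of the other backends.
print("flash sdp enabled:        ", torch.backends.cuda.flash_sdp_enabled())
print("mem-efficient sdp enabled:", torch.backends.cuda.mem_efficient_sdp_enabled())
print("math sdp enabled:         ", torch.backends.cuda.math_sdp_enabled())

# Reproduce the call from the warning with dummy tensors
# (batch/head/sequence/dim sizes here are made-up placeholders).
if torch.cuda.is_available():
    q = torch.randn(1, 8, 256, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(1, 8, 256, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(1, 8, 256, 64, device="cuda", dtype=torch.float16)
    out = F.scaled_dot_product_attention(q, k, v, attn_mask=None,
                                         dropout_p=0.0, is_causal=False)
    print("attention output shape:", out.shape)
```

If your outputs look correct and only the warning is bothersome, one option is to select a backend explicitly with the `torch.backends.cuda.sdp_kernel(...)` context manager (newer PyTorch releases deprecate it in favor of `torch.nn.attention.sdpa_kernel`), or to install a PyTorch build that ships the flash-attention kernels.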

scaruslooner commented 3 months ago

I have this problem too. Did you find any solutions?