potamides / DeTikZify

Synthesizing Graphics Programs for Scientific Figures and Sketches with TikZ
Apache License 2.0

PyTorch was not compiled with Flash Attention (Windows 11) #3

Open Malik-Hacini opened 3 months ago

Malik-Hacini commented 3 months ago

When sampling an image in the webUI, I get the following warning:

```
UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
  x = F.scaled_dot_product_attention(
```

Settings:

- Device name: NVIDIA GeForce RTX 2050
- FlashAttention available: True
- CUDA available: True
- CUDA version: 12.5
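
For reference, a minimal check along these lines (a sketch assuming a CUDA build of PyTorch ≥ 2.0; the tensor shapes and dtype are arbitrary) forces SDPA onto the flash backend, so a build without flash attention fails loudly instead of just warning:

```python
import torch
import torch.nn.functional as F

# Dummy attention inputs: flash attention needs fp16/bf16 CUDA tensors.
q = k = v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)

# Allow only the flash backend. If this build of PyTorch lacks flash
# attention, the call raises a RuntimeError ("No available kernel")
# instead of silently falling back and emitting the UserWarning above.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=False
):
    out = F.scaled_dot_product_attention(q, k, v)
print("flash attention kernel ran:", out.shape)
```

On newer PyTorch releases, `torch.nn.attention.sdpa_kernel` supersedes this context manager.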

potamides commented 3 months ago

How did you install PyTorch? This looks like an issue with your PyTorch installation rather than with this project.
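
A quick way to see what the installed wheel actually contains (standard PyTorch introspection calls, nothing specific to DeTikZify):

```python
import torch

# Report what the installed PyTorch wheel was built with.
print("torch version:", torch.__version__)        # e.g. a "+cpu" suffix means a CPU-only wheel
print("built against CUDA:", torch.version.cuda)  # None for CPU-only builds
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```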