Open bbecausereasonss opened 2 months ago
Do you use ZLUDA? https://github.com/comfyanonymous/ComfyUI/issues/4132
The same problem occurs with my UNet loader. When loading Flux fp8, selecting 'default' as the weight dtype works fine, but selecting fp8_e4m3fn produces an error: RuntimeError: CUDA error: CUBLAS_STATUS_NOT_SUPPORTED when calling `cublasLtMatmulAlgoGetHeuristic(ltHandle, computeDesc.descriptor(), Adesc.descriptor(), Bdesc.descriptor(), Cdesc.descriptor(), Ddesc.descriptor(), preference.descriptor(), 1, &heuristicResult, &returnedResult)`
PyTorch version: 2.4.0+cu118, xformers version: 0.0.27.post2+cu118, Stable version: 5f9d5a24
It seems that you need to upgrade your PyTorch to 2.5.0+cu124.
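If you want to confirm whether your installed build predates the suggested 2.5.0+cu124 before reinstalling, a quick version check is enough. This is a minimal sketch; `needs_upgrade` is a hypothetical helper written for this thread, not part of ComfyUI or PyTorch, and it only parses the version string format shown above (e.g. `torch.__version__` like `"2.4.0+cu118"`):

```python
def needs_upgrade(torch_version: str,
                  min_version: tuple = (2, 5, 0),
                  min_cuda: int = 124) -> bool:
    """Return True if this build predates the suggested 2.5.0+cu124.

    Hypothetical helper: parses strings like "2.4.0+cu118" into a
    (major, minor, patch) tuple and a CUDA toolkit number, then compares
    both against the suggested minimums.
    """
    base, _, cuda_tag = torch_version.partition("+cu")
    parts = tuple(int(p) for p in base.split(".")[:3])
    cuda = int(cuda_tag) if cuda_tag else 0
    return parts < min_version or cuda < min_cuda

# The reporter's build from this thread:
print(needs_upgrade("2.4.0+cu118"))  # → True (upgrade suggested)
# The build suggested in the comment above:
print(needs_upgrade("2.5.0+cu124"))  # → False
```

In a live install you would pass `torch.__version__` instead of a literal string. Note this only checks versions; it does not verify that your GPU itself supports fp8 e4m3fn matmuls.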
Expected Behavior
I'm having a heck of a time finding a working Torch build that just works... I don't know what happened, but I upgraded (all) and it borked my install. Now when I try a ComfyUI LoRA/Flux workflow that used to work before, I get this error.
Actual Behavior
CUDA error: CUBLAS_STATUS_NOT_SUPPORTED
Steps to Reproduce
Launch workflow. Get error.
Debug Logs
Other
No response