Zars19 opened 3 weeks ago
The result of running `test_mem_eff_attention.py` is: 2489 failed, 3483 passed, 9033 skipped, 36 warnings in 2539.42s (0:42:19)
Hi @Zars19
CUTLASS-related extensions are only compiled for CUDA (not ROCm) builds.
The currently failing tests relate to the PyTorch-internal Flash Attention implementation; this op should be disabled on ROCm because the tested features are not supported there.
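As a rough illustration of how such tests could be guarded, here is a minimal sketch. It assumes the standard `torch.version.hip` attribute for ROCm detection; the test function, tensor shapes, and skip marker below are illustrative and are not taken from the actual xformers test suite.

```python
# Minimal sketch: skip Flash-Attention-dependent tests on ROCm builds.
# Assumption: torch.version.hip is non-None on ROCm builds (standard
# PyTorch behaviour). The test body and shapes are purely illustrative.
import pytest
import torch

IS_ROCM = torch.version.hip is not None


@pytest.mark.skipif(IS_ROCM, reason="Flash Attention op not supported on ROCm builds")
def test_flash_attention_forward():
    import xformers.ops as xops

    q = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    # memory_efficient_attention dispatches to the best available backend;
    # on ROCm, the CUTLASS/Flash kernels may not be compiled in.
    out = xops.memory_efficient_attention(q, k, v)
    assert out.shape == q.shape
```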
❓ Questions and Help
Some features appear to be unavailable when running `python -m xformers.info` (cutlassF, smallkF, ...). Is this normal?
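For context, a minimal sketch of checking whether the local PyTorch build is CUDA or ROCm is shown below; it relies only on the standard `torch.version` attributes, and the printed interpretation is an assumption drawn from the reply above, not actual output of `python -m xformers.info`.

```python
# Minimal sketch: determine whether PyTorch was built for CUDA or ROCm,
# which decides whether CUTLASS-based xformers ops (cutlassF, smallkF, ...)
# can be available at all. The messages are assumptions, not xformers output.
import torch

if torch.version.hip is not None:
    print("ROCm build detected: CUTLASS-based ops are not compiled, so "
          "`python -m xformers.info` listing them as unavailable is expected.")
elif torch.version.cuda is not None:
    print("CUDA build detected: CUTLASS-based ops should be compiled in.")
else:
    print("CPU-only build: no CUDA/ROCm ops are available.")
```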