Closed: xinyazhang closed this pull request 1 month ago
Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` compiles correctly.

This is a backported version of https://github.com/pytorch/pytorch/pull/133866

Tested with `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py develop --user` followed by `python -c 'import torch'`.
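For context, build-time feature flags like `USE_FLASH_ATTENTION` are read from the environment and treated as ON/OFF switches. The sketch below shows how such a flag check could work; `flag_enabled` is a hypothetical helper for illustration, not PyTorch's actual build code (the real logic lives in PyTorch's setup/CMake helpers).

```python
import os

def flag_enabled(name: str, default: bool = True) -> bool:
    # Hypothetical helper mirroring the common ON/OFF env-flag convention;
    # an unset variable falls back to the default, "0"/"OFF"/etc. disable.
    val = os.getenv(name)
    if val is None:
        return default
    return val.strip().upper() not in ("0", "OFF", "NO", "FALSE", "N", "")

# Simulate the build invocation from this PR:
os.environ["USE_FLASH_ATTENTION"] = "0"
os.environ["USE_MEM_EFF_ATTENTION"] = "0"
print(flag_enabled("USE_FLASH_ATTENTION"))    # False
print(flag_enabled("USE_MEM_EFF_ATTENTION"))  # False
```

With both flags set to `0`, the attention kernels are excluded from the build, which is the configuration this PR fixes.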
The PR https://github.com/ROCm/pytorch/pull/1536 was merged. MEM_EFF_ATTENTION is always turned off; when will it be enabled?
Not required.