Closed hzhaoy closed 1 day ago
Please take into consideration GPUs that do not support flash-attn: https://github.com/hiyouga/LLaMA-Factory/blob/8d6cd69ac43afd4bd7c14bd02b0061455827ac9e/docker/docker-cuda/Dockerfile#L8
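One way to keep the build working on such GPUs is to gate the flash-attn rebuild behind a Docker build argument, so images for unsupported hardware simply skip it. This is a minimal sketch, not the repository's actual Dockerfile; the `INSTALL_FLASHATTN` argument name is an assumption for illustration:

```shell
# Hypothetical Dockerfile fragment (illustrative ARG name, not the repo's actual file):
# only rebuild flash-attn when the build arg is explicitly enabled.
ARG INSTALL_FLASHATTN=false
RUN if [ "$INSTALL_FLASHATTN" = "true" ]; then \
        pip uninstall -y flash-attn && \
        pip install flash-attn --no-build-isolation; \
    fi
```

Building with `--build-arg INSTALL_FLASHATTN=true` would then opt in, while the default image stays usable on GPUs without flash-attn support.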
What does this PR do?

`flash-attn` is mandatory for some models. Uninstalling and then rebuilding `flash-attn` (following the installation instructions) satisfies the requirement and fixes #4242 #4264 completely.

Before submitting
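The uninstall-then-rebuild step described above can be sketched as the following commands, assuming a working CUDA toolchain inside the image (`--no-build-isolation` is the install flag documented by flash-attn):

```shell
# Remove the prebuilt wheel, then rebuild flash-attn from source
# so it matches the CUDA/PyTorch versions installed in the image.
pip uninstall -y flash-attn
pip install flash-attn --no-build-isolation
```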