axolotl-ai-cloud / axolotl


bump flash attention 2.5.8 -> 2.6.1 #1738

Closed · winglian closed this 1 month ago

winglian commented 1 month ago

We can re-run the tests once the wheels finish building -> https://github.com/Dao-AILab/flash-attention/actions/runs/9894273391
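Once the 2.6.1 wheels are published and installed, a quick smoke test along these lines (my own sketch, not part of the PR's CI) would confirm the wheel imports cleanly and reports the bumped version:

```python
# Sketch: verify the newly installed flash-attn wheel loads and matches the bump.
import flash_attn
from flash_attn import flash_attn_func  # core attention kernel entry point

# Expect "2.6.1" after this PR's bump; an ImportError above would typically
# mean the installed wheel was built against a different torch/CUDA combo.
print(flash_attn.__version__)
```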

winglian commented 1 month ago

Looks like flash attention wheels are no longer being built for torch==2.3.0. Will update torch in a separate PR and rebase this.
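This is the crux of the wheel problem: prebuilt flash-attn wheels are compiled against specific torch versions, so pinning a flash-attn release that has no wheel for the installed torch forces a slow source build or fails outright. A hedged sketch of the kind of runtime guard this implies (the 2.4.0 floor is an illustrative assumption, not a value taken from the PR):

```python
# Sketch: only report flash attention as usable when the running torch is new
# enough to have prebuilt flash-attn wheels and the package actually imports.
import importlib.util

import torch
from packaging.version import Version

# Hypothetical minimum torch version with published flash-attn wheels.
MIN_TORCH_FOR_WHEELS = Version("2.4.0")

def can_use_flash_attn() -> bool:
    torch_version = Version(torch.__version__.split("+")[0])  # drop "+cu121"-style suffix
    if torch_version < MIN_TORCH_FOR_WHEELS:
        return False  # no prebuilt wheel; would require building from source
    return importlib.util.find_spec("flash_attn") is not None

print("flash attention available:", can_use_flash_attn())
```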