Closed — winglian closed this 1 month ago
We can re-run the tests once the wheels finish building -> https://github.com/Dao-AILab/flash-attention/actions/runs/9894273391
Looks like flash-attention wheels are no longer being built for torch==2.3.0. I'll update torch in a separate PR and rebase this one.
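For context, flash-attn release wheels encode the supported torch version in the filename (e.g. `torch2.3`), which is how a torch mismatch shows up as "no wheel available". A minimal sketch of extracting that tag — the example filename follows the project's naming scheme but is otherwise hypothetical:

```python
import re

def torch_tag(wheel_name: str):
    """Extract the torch X.Y version tag embedded in a flash-attn wheel filename."""
    m = re.search(r"torch(\d+\.\d+)", wheel_name)
    return m.group(1) if m else None

# Hypothetical wheel filename, following the flash-attn naming convention
name = "flash_attn-2.5.9+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"
print(torch_tag(name))  # → 2.3
```

If the tag doesn't match the installed `torch.__version__`, pip falls back to a source build, which is what the CI run above is working around.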