Closed binarybeastt closed 2 weeks ago
Hi, from https://github.com/Dao-AILab/flash-attention/issues/667 it seems the undefined-symbol error is caused by a version incompatibility. Can you try reinstalling flash-attention, or see if the discussion in that issue helps?
pip uninstall flash-attn
pip install --no-cache-dir --no-build-isolation flash-attn
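After reinstalling, a quick way to confirm the undefined-symbol error is gone is to attempt the import and surface any loader error. This is a generic sketch (the helper name `check_import` is mine, not from flash-attn); an undefined-symbol problem in the compiled extension typically shows up here as an ImportError or OSError:

```python
import importlib

def check_import(module_name: str) -> str:
    """Try to import a module; report its version or the failure reason."""
    try:
        mod = importlib.import_module(module_name)
        return getattr(mod, "__version__", "installed (no __version__)")
    except Exception as exc:  # ImportError / OSError for undefined symbols
        return f"import failed: {exc}"

# Hypothetical usage after reinstalling: verify the CUDA extension loads.
print("flash_attn:", check_import("flash_attn"))
```

If this still prints an undefined-symbol message, the installed wheel was likely built against a different torch/CUDA version than the one in the environment.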
Reinstalling flash attention works, thank you.
Thank you for your good work
While trying to fine-tune the interleave 0.5B model, I keep running into errors that I don't quite understand, but they're related to flash attention. For more context, I'm using 8 NVIDIA A100 GPUs.