adithya-s-k / omniparse

Ingest, parse, and optimize any data format ➡️ from documents to multimedia ➡️ for enhanced compatibility with GenAI frameworks
https://docs.cognitivelab.in
GNU General Public License v3.0
4.37k stars 350 forks

Building wheel for flash-attn (setup.py) ...(Stuck) #36

Closed lonly197 closed 2 weeks ago

lonly197 commented 2 weeks ago

I have tried many times, including downloading a compatible prebuilt wheel directly from https://github.com/Dao-AILab/flash-attention/releases/, installing it, and then running `pip install -e .`, but the installation still gets stuck at 'Building wheel for flash-attn (setup.py)' with no progress.

adithya-s-k commented 2 weeks ago

Which os and machine are you on?

lonly197 commented 2 weeks ago
adithya-s-k commented 2 weeks ago

OK, ideally that should not cause a problem. Have you tried using Docker?

adithya-s-k commented 2 weeks ago

Can you try `pip install -r pyproject.toml`?

lonly197 commented 2 weeks ago

After two hours, I received the error message 'ERROR: Failed building wheel for flash-attn'.

adithya-s-k commented 2 weeks ago

https://github.com/Dao-AILab/flash-attention/issues/420

Have you tried `pip install flash-attn --no-build-isolation`, then doing `pip install -e .`?
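For reference, the suggested workaround can be sketched as the shell sequence below. The `ninja`/`packaging` pre-install step and the `MAX_JOBS` cap are assumptions drawn from flash-attention's own installation guidance, not from this thread: flash-attn's `setup.py` imports `torch`, so an isolated build environment (which lacks torch) can hang or fail, and the CUDA compilation can exhaust RAM without a job limit. Adjust `MAX_JOBS` to your machine.

```shell
# Install build helpers into the current environment first; with
# --no-build-isolation, flash-attn builds against the torch already installed.
pip install ninja packaging

# Build flash-attn without build isolation.
# MAX_JOBS caps parallel compile jobs; lower it if the build runs out of RAM.
MAX_JOBS=4 pip install flash-attn --no-build-isolation

# Then install omniparse itself in editable mode.
pip install -e .
```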

lonly197 commented 2 weeks ago

> Dao-AILab/flash-attention#420
>
> Have you tried `pip install flash-attn --no-build-isolation`, then doing `pip install -e .`?

It works for me.