datawhalechina / self-llm

"A Practical Guide to Open-Source LLMs": quickly deploy open-source large models in a Linux environment; a deployment tutorial tailored for users in China.
Apache License 2.0
8.24k stars 985 forks

Building wheel for flash-attn (setup.py) ... - stuck. #99

Closed zhangyc closed 5 months ago

zhangyc commented 5 months ago

root@autodl-container-ba6544adff-463e93c8:~# MAX_JOBS=8 pip install flash-attn --no-build-isolation
Looking in indexes: http://mirrors.aliyun.com/pypi/simple
Collecting flash-attn
  Using cached http://mirrors.aliyun.com/pypi/packages/72/94/06f618bb338ec7203b48ac542e73087362b7750f9c568b13d213a3f181bb/flash_attn-2.5.8.tar.gz (2.5 MB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: torch in ./miniconda3/lib/python3.10/site-packages (from flash-attn) (2.1.2+cu121)
Requirement already satisfied: einops in ./miniconda3/lib/python3.10/site-packages (from flash-attn) (0.8.0)
Requirement already satisfied: packaging in ./miniconda3/lib/python3.10/site-packages (from flash-attn) (23.2)
Collecting ninja (from flash-attn)
  Using cached http://mirrors.aliyun.com/pypi/packages/6d/92/8d7aebd4430ab5ff65df2bfee6d5745f95c004284db2d8ca76dcbfd9de47/ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl (307 kB)
Requirement already satisfied: filelock in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (3.13.1)
Requirement already satisfied: typing-extensions in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (4.9.0)
Requirement already satisfied: sympy in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (1.12)
Requirement already satisfied: networkx in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (3.2.1)
Requirement already satisfied: jinja2 in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (3.1.2)
Requirement already satisfied: fsspec in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (2023.12.2)
Requirement already satisfied: triton==2.1.0 in ./miniconda3/lib/python3.10/site-packages (from torch->flash-attn) (2.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in ./miniconda3/lib/python3.10/site-packages (from jinja2->torch->flash-attn) (2.1.3)
Requirement already satisfied: mpmath>=0.19 in ./miniconda3/lib/python3.10/site-packages (from sympy->torch->flash-attn) (1.3.0)
Building wheels for collected packages: flash-attn
  Building wheel for flash-attn (setup.py) ... -

It gets stuck here with no response at all. Could someone help?

KMnO4-zx commented 5 months ago

The install is just slow; wait a while, roughly ten to twenty minutes.

zhangyc commented 5 months ago

The install is just slow; wait a while, roughly ten to twenty minutes.

It has been about half an hour with no response.

zhangyc commented 5 months ago

Building wheels for collected packages: flash-attn
  Building wheel for flash-attn (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [9 lines of output]
      fatal: not a git repository (or any of the parent directories): .git

  torch.__version__  = 2.1.2+cu121

  running bdist_wheel
  Guessing wheel URL:  https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
  error: <urlopen error retrieval incomplete: got only 5865472 out of 120616671 bytes>
  [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects

That is the full error output.
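Reading the output above, the build is not failing in the compiler: flash-attn's setup.py first tries to download a ~120 MB prebuilt wheel from GitHub releases ("Guessing wheel URL: ..."), and that download is being cut off ("got only 5865472 out of 120616671 bytes"). Besides using a proxy, one workaround sketch is to skip the wheel download and compile from source; this assumes the `FLASH_ATTENTION_FORCE_BUILD` switch that recent flash-attn releases honor in setup.py, and a local source build can take a long time:

```shell
# Force a local source build instead of downloading the prebuilt wheel.
# FLASH_ATTENTION_FORCE_BUILD is read by flash-attn's setup.py (assumption:
# present in your flash-attn version); MAX_JOBS caps parallel nvcc jobs.
FLASH_ATTENTION_FORCE_BUILD=TRUE MAX_JOBS=8 pip install flash-attn --no-build-isolation
```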

zhangyc commented 5 months ago

Solution: it was a network problem. I turned on a proxy and it was resolved instantly.
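For anyone hitting the same wall, a minimal sketch of routing pip through a proxy via the standard `http_proxy`/`https_proxy` environment variables (the address and port below are placeholders; substitute your own proxy endpoint):

```shell
# Placeholder proxy endpoint -- replace with your actual proxy host:port.
export http_proxy=http://127.0.0.1:7890
export https_proxy=http://127.0.0.1:7890

# pip and the wheel download inside setup.py both honor these variables.
MAX_JOBS=8 pip install flash-attn --no-build-isolation
```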

hymhymhym123 commented 3 months ago

How do you use a proxy on AutoDL?

Avon-cpu commented 2 months ago

My proxy is on, and it is still stuck.

hymhymhym123 commented 2 months ago

I manually downloaded the wheel from the flash-attention releases page.
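A sketch of that manual route, using the exact wheel URL that setup.py guessed in the error message above (pick the file matching your own Python, CUDA, and torch versions from the Dao-AILab/flash-attention releases page):

```shell
# Download the prebuilt wheel directly (URL copied from the error log above;
# cu122 / torch2.1 / cp310 must match your environment).
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

# Install the local file instead of letting pip fetch it.
pip install flash_attn-2.5.8+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

Downloading through a browser or a download manager and uploading the file to the server also works when the server itself has no usable route to GitHub.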
