jiahe7ay / MINI_LLM

This is a repository for individuals to experiment with and reproduce the pre-training process of an LLM.

Installing the flash-attn package hangs for an extremely long time #3

Open Zengfanxu1111 opened 7 months ago

Zengfanxu1111 commented 7 months ago

Has anyone run into this situation before?

Building wheels for collected packages: flash-attn
Building wheel for flash-attn (setup.py) ... / (it just sits here spinning)

jiahe7ay commented 7 months ago

That's because the build needs to connect to GitHub to download git submodules. You can try compiling it yourself, or just download the prebuilt whl package and install that directly.
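
For reference, a minimal sketch of the prebuilt-wheel route. The wheel filename below is only an example; pick the one matching your Python, PyTorch, and CUDA versions from the flash-attention releases page:

```bash
# Prebuilt wheels are published on the flash-attention GitHub releases page:
#   https://github.com/Dao-AILab/flash-attention/releases
# The filename below is illustrative only; choose the asset that matches
# your Python / PyTorch / CUDA versions.
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install flash_attn-2.5.8+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

# Or, if your network can reach GitHub, build from source with build
# isolation disabled, as flash-attn's own README suggests:
pip install flash-attn --no-build-isolation
```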

iissy commented 2 months ago


I can't even get it to install here; it wants the CUDA_HOME environment variable configured. I haven't installed the CUDA Toolkit, so is installing it strictly required?
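
A quick sketch of how CUDA_HOME is typically set, assuming a standard Linux install of the CUDA Toolkit under /usr/local/cuda (that path is an assumption; adjust it to your machine). Building flash-attn from source compiles CUDA kernels with nvcc, which is why it looks for CUDA_HOME; a prebuilt wheel skips the compilation step entirely:

```bash
# flash-attn's source build compiles CUDA kernels with nvcc, so it needs
# the full CUDA Toolkit, located via the CUDA_HOME environment variable.
# /usr/local/cuda is the conventional Linux install path (an assumption;
# point it at wherever your toolkit actually lives).
export CUDA_HOME=/usr/local/cuda
export PATH="$CUDA_HOME/bin:$PATH"

# Verify that nvcc is now visible to the build:
nvcc --version

# If you don't want to install the toolkit at all, installing a prebuilt
# wheel (see the earlier snippet) avoids compilation entirely.
```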