blazerye / DrugAssist

DrugAssist: A Large Language Model for Molecule Optimization
https://arxiv.org/abs/2401.10334

Issue in code #2

Open AKANKSHASINGH233 opened 3 months ago

AKANKSHASINGH233 commented 3 months ago

I am not able to install flash-attn. Could you kindly share the entire requirements file?

AKANKSHASINGH233 commented 3 months ago

I was trying with Python 3.8.

ajaymur91 commented 3 months ago

I had a lot of trouble installing as well. The following worked for me. Hope this helps.

`nvidia-smi` reports:

```
NVIDIA-SMI 525.147.05   Driver Version: 525.147.05   CUDA Version: 12.0
```

```shell
mamba create -n drugassist4 python=3.8 pip cudatoolkit=11.7 cudatoolkit-dev=11.7 -c conda-forge -y
conda activate drugassist4
pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2
pip install packaging
pip install -r requirements.txt --no-build-isolation
```
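For context on why the environment above pins `cudatoolkit=11.7` even though the driver reports CUDA 12.0: building flash-attn from source generally requires a local CUDA toolkit whose major version matches the CUDA version PyTorch was compiled against (the driver only needs to be at least that new). A minimal sketch of that compatibility check, using hypothetical helper names of my own (not part of DrugAssist or flash-attn):

```python
def cuda_major(version: str) -> int:
    """Return the major component of a CUDA version string like '11.7'."""
    return int(version.split(".")[0])

def toolchain_compatible(torch_cuda: str, toolkit_cuda: str) -> bool:
    """flash-attn source builds generally need the local CUDA toolkit's
    major version to match the CUDA version torch was built with."""
    return cuda_major(torch_cuda) == cuda_major(toolkit_cuda)

# torch 2.0.1 cu117 build together with cudatoolkit 11.7: compatible.
print(toolchain_compatible("11.7", "11.7"))  # True
# Same torch build against a CUDA 12.0 toolkit: likely to fail to compile.
print(toolchain_compatible("11.7", "12.0"))  # False
```

This is only a sketch of the version-matching rule, not an exhaustive check; in a real environment you would compare `torch.version.cuda` against the output of `nvcc --version`.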

blazerye commented 3 months ago

There is no problem using a Python 3.8 environment. We've received feedback about dependency issues in certain situations where package requirements cannot be satisfied. We've updated requirements.txt (vllm 0.1.3 -> 0.1.4); please try again with it. Alternatively, you can try the solution provided by ajaymur91.

mengtinghuang commented 3 months ago

> i am not able to install-flash-attn kindly share entire requirement file.

I am facing the same problem. Have you solved it?

likefei09 commented 2 months ago

> i am not able to install-flash-attn kindly share entire requirement file.

Me too. Have you solved it?

likefei09 commented 2 months ago

> I had a lot of trouble installing as well. The following worked for me. Hope this helps
>
> `nvidia-smi` reports:
>
> ```
> NVIDIA-SMI 525.147.05   Driver Version: 525.147.05   CUDA Version: 12.0
> ```
>
> ```shell
> mamba create -n drugassist4 python=3.8 pip cudatoolkit=11.7 cudatoolkit-dev=11.7 -c conda-forge -y
> conda activate drugassist4
> pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2
> pip install packaging
> pip install -r requirements.txt --no-build-isolation
> ```

I tried this, but I still can't install flash-attn.