axolotl-ai-cloud / axolotl


Cannot install on Google Colab #1933

Open benjamin-marie opened 1 month ago

benjamin-marie commented 1 month ago

Please check that this issue hasn't been reported before.

Expected Behavior

axolotl should be installed.

Current behaviour

When I ran the installation commands on Google Colab (L4 GPU):

    !git clone https://github.com/axolotl-ai-cloud/axolotl.git
    !cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'

It yields:

ERROR: Cannot install None and axolotl because these package versions have conflicting dependencies.

The conflict is caused by:

    axolotl 0.4.1 depends on torch==2.4.1+cu121
    accelerate 0.34.2 depends on torch>=1.10.0
    bitsandbytes 0.44.0 depends on torch
    liger-kernel 0.3.0 depends on torch>=2.1.2
    optimum 1.16.2 depends on torch>=1.11
    peft 0.13.0 depends on torch>=1.13.0
    trl 0.9.6 depends on torch>=1.4.0
    xformers 0.0.27 depends on torch==2.3.1

To fix this you could try to:

  1. loosen the range of package versions you've specified
  2. remove package versions to allow pip to attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

Steps to reproduce

  1. Go to Google Colab.
  2. Select a GPU instance, e.g. L4.
  3. Run the installation commands:

    !git clone https://github.com/axolotl-ai-cloud/axolotl.git
    !cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'

Config yaml

No response

Possible solution

Update the requirements.
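Concretely, pip's first suggestion above would mean loosening the hard torch pin. The following is only an illustrative sketch of what such a change to requirements.txt could look like, not the fix that was eventually merged:

    # requirements.txt (illustrative sketch only)
    # current pin, per the resolver error above:
    #   torch==2.4.1+cu121
    # a loosened range that would let the resolver pick a torch build
    # compatible with the pinned xformers release:
    torch>=2.3.1,<2.5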

Which Operating Systems are you using?

Python Version

3.10

axolotl branch-commit

main

Acknowledgements

nicokim commented 1 month ago

Same thing happened to me on RunPod. I updated the versions of transformers and xformers in requirements.txt:

    xformers==0.0.28
    transformers==4.45.1

and it worked somehow haha! Can you try it?
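For anyone who wants to try this without editing the file by hand, a minimal Colab-cell sketch of this suggestion could look like the following. The sed patterns assume requirements.txt pins these packages on their own lines; check the file first if they don't match.

    !git clone https://github.com/axolotl-ai-cloud/axolotl.git
    # Bump the two pins that conflict with Colab's preinstalled torch (per the comment above).
    # Assumption: requirements.txt contains plain "xformers==..." / "transformers==..." lines.
    !sed -i 's/^xformers==.*/xformers==0.0.28/' axolotl/requirements.txt
    !sed -i 's/^transformers==.*/transformers==4.45.1/' axolotl/requirements.txt
    !cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'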

benjamin-marie commented 1 month ago

Yes, it works. The requirements must be updated. I expect new installations of Axolotl to be impossible until it's fixed.

Hasan-Demez commented 1 month ago

I also ran across the same issue in VS Code, and the same solution worked for me as well. Thank you.

NanoCode012 commented 1 month ago

Hey, sorry for this. We're trying to make this clearer. Colab's preinstalled version of torch may be incompatible with the specific version required by xformers: previously you needed torch==2.3.1, and now 2.4.1, for xformers to be happy. Since Colab currently ships 2.4.1, this issue should no longer occur.

I plan to remove the dependency on xformers, as it's creating this version-locking issue.
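As a quick sanity check (my addition, not part of the maintainers' instructions), you can print the torch build a fresh Colab runtime ships with before installing anything, so you know which xformers pin it can satisfy:

    # Check which torch build the Colab runtime ships with.
    !python -c "import torch; print(torch.__version__)"
    # Optionally confirm whether xformers is already present and at which version.
    !pip3 show xformers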

Amit30swgoh commented 1 month ago

Python 3.10.12

For torch 2.4.1 (CUDA 12.4):

    pip install torch==2.4.1+cu124 torchvision==0.19.1+cu124 torchaudio==2.4.1+cu124 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu124
    pip install xformers==0.0.28.post1

For torch 2.4.0 (CUDA 12.4):

    pip install torch==2.4.0+cu124 torchvision==0.19.0+cu124 torchaudio==2.4.0+cu124 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu124
    pip install xformers==0.0.27.post2

For torch 2.3.1 (CUDA 12.1):

    pip install torch==2.3.1+cu121 torchvision==0.18.1+cu121 torchaudio==2.3.1+cu121 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu121
    pip install xformers==0.0.27

Enjoy!
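Whichever combination you pick, a quick import check after the install (my own suggestion, not from the thread) will confirm that torch and xformers actually resolved together:

    # Verify that the installed torch and xformers versions import side by side.
    python -c "import torch, xformers; print(torch.__version__, xformers.__version__)"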

NanoCode012 commented 4 weeks ago

I updated the Colab example in the linked PR. It uses the instructions from the README, which install fine. I'd recommend anyone hitting this issue to check it out. Sorry for any confusion.
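After running the updated notebook, pip itself can confirm that the dependency conflict from this issue is gone; this check is a suggestion of mine rather than part of the notebook:

    # Verify that no installed package has unmet or conflicting requirements.
    !pip3 check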

bursteratom commented 1 day ago

rebased the old PR into the new PR here:

https://github.com/axolotl-ai-cloud/axolotl/pull/2074