xNul / chat-llama-discord-bot

A Discord Bot for chatting with LLaMA, Vicuna, Alpaca, MPT, or any other Large Language Model (LLM) supported by text-generation-webui or llama.cpp.
https://discord.gg/TcRGDV754Y
MIT License

Can't get cuda-toolkit to install on Mac Ventura 13.3 #11

Closed alexdwagner closed 1 year ago

alexdwagner commented 1 year ago

Perhaps it's way over my head to be attempting to run this repo without being a dev, but this seemed like the best way to contact you, so here goes.

The terminal command below doesn't work for me:

conda create -n textgen python=3.10.9 torchvision torchaudio pytorch-cuda=11.7 cuda-toolkit conda-forge::ninja conda-forge::git -c pytorch -c nvidia/label/cuda-11.7.0 -c nvidia

When I run this, I get:

Solving environment: failed

PackagesNotFoundError: The following packages are not available from current channels:

  - cuda-toolkit

Any ideas on how to fix?
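
(For reference: the cuda-toolkit and pytorch-cuda packages in that command are only published for Linux and Windows, which is why conda can't resolve them on macOS. A CPU-only variant along these lines is the closest macOS equivalent; the versions are carried over from the command above, but the exact package/channel mix is an assumption, not a tested recipe.)

# CPU-only macOS variant (assumption): drop the CUDA packages and the nvidia channels
conda create -n textgen python=3.10.9 pytorch torchvision torchaudio conda-forge::ninja conda-forge::git -c pytorch -c conda-forge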

xNul commented 1 year ago

Yeah I'm sure it's complicated for someone who's not a dev lol. I'm not sure why it's not working for you, but luckily a new one-click installer was just released over here. It should make things easier.

It'll ask you what brand of GPU you have (or no GPU), then you select the letter corresponding to it. When you get to the model downloading step, you'll select the letter for downloading another model (the letter L) and enter decapoda-research/llama-7b-hf, or the model of your choice.
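
(If the installer's download menu gives you trouble, the same model can usually be fetched directly with text-generation-webui's download script; the script name and invocation below reflect that project's layout at the time and may have changed since.)

# Assumed alternative to the installer's "download another model" option (letter L)
cd text-generation-webui
python download-model.py decapoda-research/llama-7b-hf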

Make sure you're able to run and chat with the model using text-generation-webui before trying to use the bot, but once you're able to, just follow steps 2-5 here and you should be good to go!

Let me know if you run into any more issues! There's an invite to the Discord btw and I'm in there too.

alexdwagner commented 1 year ago

Thank you so much! I'm downloading the model right now and will run the installer tonight. I joined the Discord too. I really appreciate your reply, and I'm excited to try out the one-click installer!

alexdwagner commented 1 year ago

Also, you probably already know this, but for anyone who stumbles across this issue: CUDA only runs on NVIDIA graphics cards, which aren't supported on modern Macs (NVIDIA dropped macOS support after CUDA 10.2, and Apple Silicon Macs can't use NVIDIA GPUs at all).
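
(Since CUDA is off the table on a Mac, the llama.cpp route mentioned in the repo description is the usual workaround. A rough sketch follows; the model filename is a placeholder, and the Metal build flag assumes a 2023-era llama.cpp checkout.)

# Build llama.cpp with Metal acceleration (Apple Silicon) and run a quick test prompt
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
LLAMA_METAL=1 make
./main -m ./models/llama-7b.ggmlv3.q4_0.bin -p "Hello" -n 128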