cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai

Running Alpaca-7B on Ubuntu but can't load Alpaca-13B or Llama #411

Open MPCSbill opened 1 year ago

MPCSbill commented 1 year ago

Hi, great project! I was working with AI back in the '80s and it has come a LONG way!

Here is my problem. I am running:

- Ubuntu 22.10
- Python 3.10.7
- Node.js 18
- CPU: Intel(R) Core(TM) i7-4820K @ 3.70GHz
- 32GB system memory
- GPU: ZOTAC GeForce RTX 3060 Twin Edge OC 12GB GDDR6 192-bit 15 Gbps PCIe 4.0
- 1TB SSD with 900GB free

I have Dalai Alpaca-7B installed and running OK (although it may be slower than it should be, it is very accurate). When I try to install the Alpaca-13B model with `npx dalai alpaca install 13B`, it looks like it fails here with a 404:

g++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread quantize.cpp ggml.o utils.o -o quantize
mpcs@mpcs-ai:~/dalai/alpaca$ exit
exit
alpaca.add [ '13B' ]
dir /home/mpcs/dalai/alpaca/models/13B
downloading torrent
ggml-model-q4_0.bin 0% /
ggml-model-q4_0.bin 100% [================================================>] done
ERROR AxiosError: Request failed with status code 404
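Not a fix, but it may help to check what the failed 13B download actually left on disk; a 404 often leaves a tiny or truncated file behind that will then refuse to load. A minimal sketch, assuming the default layout dalai itself reported in the log above (`~/dalai/alpaca/models/13B`):

```bash
# Inspect what the failed 13B install left behind (path taken from the log above).
MODEL_DIR="$HOME/dalai/alpaca/models/13B"

ls -lh "$MODEL_DIR"

# A real q4_0 13B model file is several GB; a few KB usually means an HTML
# error page or a truncated download that should be deleted before retrying.
if [ -f "$MODEL_DIR/ggml-model-q4_0.bin" ]; then
  du -h "$MODEL_DIR/ggml-model-q4_0.bin"
  file "$MODEL_DIR/ggml-model-q4_0.bin"
fi
```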

I have a different problem if I try to install LLaMA, even with the smaller model: `npx dalai llama install 7B`

It just runs and hangs at consolidated.00.pth 39% [===============================> ] with 85 hours remaining.

I let it run overnight and it does not move.
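One way to tell whether the download is truly stalled, rather than just crawling, is to watch whether the partial consolidated.00.pth file is still growing. A diagnostic sketch (not a dalai feature), assuming the default llama layout `~/dalai/llama/models/7B`:

```bash
# Check whether the partial download is still growing (the path is an
# assumption based on the default dalai layout for the llama 7B model).
PTH="$HOME/dalai/llama/models/7B/consolidated.00.pth"

size1=$(stat -c %s "$PTH" 2>/dev/null || echo 0)
sleep 60
size2=$(stat -c %s "$PTH" 2>/dev/null || echo 0)

if [ "$size2" -gt "$size1" ]; then
  echo "Still downloading: grew by $((size2 - size1)) bytes in 60s"
else
  echo "No growth in 60s - the download looks stalled"
fi
```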

I tried the fix: `sudo apt-get install build-essential python3-venv -y` and also `apt-get install cmake g++`.

I think I have enough horsepower, and I don't see any errors. Any thoughts or ideas on either problem would be greatly appreciated. Thanks in advance.

MrAnayDongre commented 1 year ago

For the Alpaca 7B model you can follow the link. This is the only direct download I could find.

twerpyfie commented 1 year ago

Download the files manually and place them like this: [screenshot of the models folder]

Try these: alpaca7B, alpaca13B
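As a rough illustration of the manual placement: the target directory below follows the layout dalai reported in the log earlier in this thread, while the source path (`~/Downloads/...`) is just a placeholder for wherever you saved the manually downloaded file.

```bash
# Place a manually downloaded Alpaca 13B model where dalai expects it.
# Target directory comes from the log above: /home/mpcs/dalai/alpaca/models/13B
MODEL_DIR="$HOME/dalai/alpaca/models/13B"
mkdir -p "$MODEL_DIR"

# Move the hand-downloaded file into place, keeping the filename dalai uses.
mv ~/Downloads/ggml-model-q4_0.bin "$MODEL_DIR/"

# Then start the web UI and select the 13B model.
npx dalai serve
```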