cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai

Alpaca.7B didn't execute #427

Open EvanYLLu opened 1 year ago

EvanYLLu commented 1 year ago

Operating System: Mac, CPU: M1

I have the same problem as #372. I followed the steps in #373, which fixed the "TypeError: Cannot read properties of undefined (reading 'split')" error, but alpaca still didn't execute. It gets stuck after sending the task instruction and never responds. How can I solve this?

> query: {
  seed: -1,
  threads: 4,
  n_predict: 200,
  top_k: 40,
  top_p: 0.9,
  temp: 0.8,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [ 'alpaca.7B' ],
  prompt: 'how many an apple?',
  id: 'TS-1682655307819-94181',
  model: 'alpaca.7B'
}
{ Core: 'alpaca', Model: '7B' }
exec: /Users/evanlu/dalai/alpaca/main --seed -1 --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin --top_k 40 --top_p 0.9 --temp 0.8 --repeat_last_n 64 --repeat_penalty 1.3 -p "how many an apple?" in /Users/evanlu/dalai/alpaca
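
To check whether the hang is in the web UI or in the alpaca binary itself, the same command from the exec line above can be run by hand (paths, flags, and prompt are copied from this log; adjust for your own machine):

cd /Users/evanlu/dalai/alpaca
# confirm the quantized model file is present and roughly 4 GB
ls -lh models/7B/ggml-model-q4_0.bin
# same invocation dalai logged above
./main --seed -1 --threads 4 --n_predict 200 \
  --model models/7B/ggml-model-q4_0.bin \
  --top_k 40 --top_p 0.9 --temp 0.8 \
  --repeat_last_n 64 --repeat_penalty 1.3 \
  -p "how many an apple?"

If the binary prints tokens here but the web UI still shows nothing, the problem is likely in how dalai spawns the process rather than in the model itself.
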
tfmertz commented 1 year ago

Same here. I followed the Docker instructions:

docker compose build
docker compose run dalai npx dalai alpaca install 7B
docker compose up

Then I went to the web server at 127.0.0.1:3000 and entered a prompt. It immediately returned the prompt back to me in the list at the bottom, and the button kept spinning on cancel indefinitely.

I installed llama with docker compose run dalai npx dalai llama install 7B, picked it from the drop-down, and everything worked great. alpaca 7B just seemed to hang indefinitely (or never start?).

Here's a picture of alpaca vs. llama, both 7B: alpaca_v_llama
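
Since llama 7B works but alpaca 7B hangs, one thing worth checking is whether the alpaca model actually finished downloading inside the container. A minimal check, assuming dalai's home directory in the container is /root/dalai (verify the real path against the volume mounts in docker-compose.yml):

# list the downloaded model files for both cores
docker compose run dalai ls -lh /root/dalai/alpaca/models/7B/
docker compose run dalai ls -lh /root/dalai/llama/models/7B/

The quantized ggml-model-q4_0.bin for 7B should be on the order of 4 GB; a missing or much smaller alpaca file would explain alpaca hanging while llama runs fine.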

junxian428 commented 1 year ago

Mine works, but the outcome is not what I imagined.

junxian428 commented 1 year ago

The outcome is not desirable; maybe it's because I'm using the LLaMA 7B model.
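
Base LLaMA 7B is a plain text-completion model rather than an instruction-tuned one, so it tends to continue a question instead of answering it; alpaca is the instruction-tuned variant. As a rough illustration (same flags as the exec line earlier in this thread; the prompt wording is just an example of the Stanford Alpaca instruction format):

./main --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin \
  -p "Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
How many calories are in an apple?

### Response:"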