cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai
13.1k stars · 1.42k forks

Run alpaca.7B exception #429

Open roomyxie opened 1 year ago

roomyxie commented 1 year ago

```
[root@VM-19-3-tencentos ~]# npx dalai serve
mkdir /root/dalai
Server running on http://localhost:3000/

query: { method: 'installed', models: [] }
modelsPath /root/dalai/alpaca/models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath /root/dalai/llama/models
{ modelFolders: [] }
query: {
  seed: -1,
  threads: 4,
  n_predict: 200,
  top_k: 40,
  top_p: 0.9,
  temp: 0.8,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [ 'alpaca.7B' ],
  prompt: 'The expected response for a highly intelligent chatbot to ">PROMPT" is \n"',
  id: 'TS-1682941922036-71204'
}
/root/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/root/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219:35)
    at Socket.<anonymous> (/root/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:534:20)
    at Socket.emit (node:events:511:28)
    at Socket.emitUntyped (/root/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/typed-events.js:69:22)
    at /root/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v20.0.0
```
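The TypeError happens because `req.model` is `undefined` by the time the request reaches `Dalai.query`, so the `.split(".")` on line 219 throws. A minimal defensive sketch of that parsing step (a hypothetical helper for illustration, not the project's actual code):

```javascript
// Sketch of the model-name parsing that crashes at dalai/index.js:219.
// A request like { models: ['alpaca.7B'] } should yield core "alpaca"
// and model "7B"; if req.model is undefined, fail with a clear error
// instead of a TypeError deep inside the socket handler.
function parseModel(req) {
  if (typeof req.model !== "string" || !req.model.includes(".")) {
    throw new Error(`Invalid or missing model: ${req.model}`);
  }
  const [core, model] = req.model.split(".");
  return { core, model }; // e.g. { core: 'alpaca', model: '7B' }
}
```

With a guard like this, a corrupted or missing model selection from the web UI would produce an actionable error message rather than the undefined `.split` crash above.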

ghost commented 1 year ago

User ignis32 has come up with the following workaround / solution:

> UPD: Okay, looks like I've figured it out. Something in the site data stored by the browser seems to be getting corrupted. Clearing the browser's cookies / site data / localStorage fixes it.
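The clearing step can also be done from the browser's devtools console on the dalai page, without digging through settings. A sketch (the helper name is made up; it just wipes client-side storage and reloads):

```javascript
// Run in the browser devtools console while on http://localhost:3000/.
// Wipes any cached client-side state the dalai web UI may hold, then
// reloads the page so it starts from a clean slate.
function clearDalaiState(win) {
  win.localStorage.clear();   // persistent key/value storage
  win.sessionStorage.clear(); // per-tab storage
  win.location.reload();      // fetch the page fresh
}

// In the browser: clearDalaiState(window);
```

Note this does not clear cookies; for those, use the browser's site-data settings as described in the workaround above.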