withcatai / catai

Run an AI ✨ assistant locally, with a simple API for Node.js 🚀
https://withcatai.github.io/catai/
MIT License

Error: Missing field `nGpuLayers` #17

Closed · m0dE closed 1 year ago

m0dE commented 1 year ago

I'm trying to run the Wizard-Vicuna-13B-Uncensored model on a VM (16GB RAM), but I'm getting the following error:

```
Error: Missing field `nGpuLayers`
    at LLamaCpp. (file:///usr/local/lib/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63:35)
    at Generator.next ()
    at file:///usr/local/lib/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:33:61
    at new Promise ()
    at __async (file:///usr/local/lib/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:17:10)
    at LLamaCpp.load (file:///usr/local/lib/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:61:12)
    at LLM.load (/usr/local/lib/node_modules/catai/node_modules/llama-node/dist/index.cjs:52:21)
    at #addNew (file:///usr/local/lib/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:88:21)
    at new NodeLlamaActivePull (file:///usr/local/lib/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:19:38)
    at file:///usr/local/lib/node_modules/catai/src/alpaca-client/node-llama/node-llama.js:8:48 {
  code: 'InvalidArg'
}
```
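For context: the `InvalidArg` error is raised by llama-node's native llama.cpp binding when the load config it receives omits the `nGpuLayers` field, so catai's internally built config was missing that key. A minimal sketch of a load-config object that includes the field (the field names beyond `nGpuLayers` are assumptions based on llama-node's llama.cpp backend; the values and model path are purely illustrative):

```typescript
// Illustrative llama.cpp load config for llama-node; the point is that
// `nGpuLayers` must be present, even when running CPU-only.
const loadConfig = {
  modelPath: "/path/to/Wizard-Vicuna-13B-Uncensored.bin", // model file on disk
  enableLogging: true,
  nCtx: 1024,      // context window size
  seed: 0,
  useMlock: false,
  useMmap: true,
  nGpuLayers: 0,   // the field the error complains about; 0 = no layers offloaded to GPU
};

console.log("nGpuLayers" in loadConfig); // → true
```

On a GPU-less VM like the one above, `nGpuLayers: 0` is the sensible value; the bug was simply that the key was absent, not that a GPU was required.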

celisum commented 1 year ago

+1

ido-pluto commented 1 year ago

Currently checking this.

ido-pluto commented 1 year ago

Fixed in the latest version; run `catai update` to upgrade.