Closed sabagithub closed 1 year ago
Try to run this inside git-bash: https://git-scm.com/
What hardware do you have?
By the way, the nvm installation is primarily for Linux users; for Windows the official installation is good enough.
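(When the same commands behave differently in cmd and git-bash, the two shells are often resolving different Node installations from PATH. A tiny diagnostic script — nothing CatAI-specific, just standard `process` fields — can confirm which binary each shell picks up:)

```javascript
// which-node.js — print which Node binary and version the current shell
// resolved, so a cmd run can be compared against a git-bash run.
console.log("node binary :", process.execPath); // e.g. an nvm-managed path
console.log("node version:", process.version);
console.log("platform    :", process.platform);
```

Running `node which-node.js` from both shells and comparing the printed paths shows whether they use the same installation; different paths can explain a MODULE_NOT_FOUND in one shell but not the other.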
Thanks for the quick response. I am using Windows 10 on an Intel i7-6700 with 16 GB of RAM and a GTX 980 4 GB. I installed everything via the Windows command line.
I tried it in the git-bash console, and I can access CatAI via catai commands (without running each command manually). I no longer get the MODULE_NOT_FOUND error. Thanks!
However, when I type catai serve (still in git-bash) I get the 'nGpuLayers' error.
sabagithub@DESKTOP-SABA MINGW64 ~
$ catai serve
$ cd C:\Users\sabagithub\AppData\Roaming\nvm\v20.2.0\node_modules\catai
$ npm start -- --production true --ui catai

> catai@0.3.8 start
> node src/index.js --production true --ui catai

Listening on http://127.0.0.1:3000
file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63
      this.instance = yield LLama.load(path, rest, enableLogging);
                            ^

Error: Missing field `nGpuLayers`
    at LLamaCpp.<anonymous> (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63:35)
    at Generator.next (<anonymous>)
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:33:61
    at new Promise (<anonymous>)
    at __async (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:17:10)
    at LLamaCpp.load (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:61:12)
    at LLM.load (C:\Users\sabagithub\AppData\Roaming\nvm\v20.2.0\node_modules\catai\node_modules\llama-node\dist\index.cjs:52:21)
    at #addNew (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:88:21)
    at new NodeLlamaActivePull (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:19:38)
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/node-llama.js:8:48 {
  code: 'InvalidArg'
}

Node.js v20.2.0
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/scripts/cli.js:69:27
exit code: 1
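(For context on the error itself: code: 'InvalidArg' with a "Missing field" message typically comes from the native binding layer, which deserializes the JavaScript config object into a struct whose fields are all required, so a key the caller never set fails before the model even loads. A hand-rolled sketch of that behavior — the field names here are illustrative, not llama-node's actual schema:)

```javascript
// Sketch of a required-field check like the one the native binding
// performs when converting a JS config object into its internal struct.
// The field list is hypothetical; only nGpuLayers mirrors the real error.
function loadConfig(config) {
  const required = ["path", "nCtx", "nGpuLayers"]; // hypothetical field list
  for (const field of required) {
    if (!(field in config)) {
      const err = new Error(`Missing field \`${field}\``);
      err.code = "InvalidArg"; // mirrors the code in the stack trace above
      throw err;
    }
  }
  return config;
}

// A config built without nGpuLayers is rejected up front...
try {
  loadConfig({ path: "model.bin", nCtx: 2048 });
} catch (e) {
  console.log(e.code); // prints "InvalidArg"
}

// ...while supplying the field (even as 0) passes the check.
console.log(loadConfig({ path: "model.bin", nCtx: 2048, nGpuLayers: 0 }).nGpuLayers); // prints 0
```

This matches the fix shipped in CatAI: the caller simply has to pass the newly required field to the updated llama-node.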
Fixed in the latest version; update with catai update
Thanks Ido-pluto, it works now!
Hi, I'm new to using Node, so I'm not sure what's going on. I can't use CatAI because it says it can't find my Node installation.
Steps:
1. I run catai list, and it fails.
2. catai models works and lists all available models.
3. catai install Stable-Vicuna-13B downloads the model, but it failed when it tried to use the model.
4. I run catai list again, and it still fails.

Note: I can go to C:\Users\sabsa\AppData\Roaming\nvm\v20.2.0\node_modules\catai\scripts manually and run npm run list, and it will say "No model downloaded", so it works if I run it manually.

Note 2: Knowing this, I downloaded Stable-Vicuna-13B again and ran npm run use Stable-Vicuna-13B in the node_modules/catai/scripts directory manually, and it worked. I then ran npm start -- --production true --ui catai. It started a server, but then failed with Error: Missing field 'nGpuLayers', so I don't know if that's happening because I didn't start CatAI the correct way, or if llama.cpp was updated and llama-node is out of date. Error message: