withcatai / catai

Run AI ✨ assistant locally! with simple API for Node.js 🚀
https://withcatai.github.io/catai/
MIT License
457 stars 30 forks

Windows NodeJS MODULE_NOT_FOUND issue. #16

Closed sabagithub closed 1 year ago

sabagithub commented 1 year ago

Hi, I'm new to using node, so I'm not sure what's going on. I can't use catai because it says it doesn't find my node installation.

Steps:

  1. I already had Node installed, so I installed catai. I opened cmd and ran `catai list`, and it failed.
  2. `catai models` works and listed all available models, and `catai install Stable-Vicuna-13B` downloaded the model, but it failed when it tried to use the model.
  3. I uninstalled catai, removed the catai data directory, removed Node, installed nvm for Windows, and installed the current Node.js. I re-installed catai, ran `catai list`, and it still fails.

Note: I can go to `C:\Users\sabsa\AppData\Roaming\nvm\v20.2.0\node_modules\catai\scripts` manually and run `npm run list`, and it will say "No model downloaded", so it works if I run it manually.

Note 2: Knowing this, I downloaded Stable-Vicuna-13B again and manually ran `npm run use Stable-Vicuna-13B` in the `node_modules/catai/scripts` directory, and it worked. I then ran `npm start -- --production true --ui catai`. It started a server but then failed with `Error: Missing field 'nGpuLayers'`, so I don't know if that's happening because I didn't start catai the correct way, or if llama.cpp was updated and llama-node is out of date?

error message:

```
C:\Users\sabagithub>catai list
$ cd C:\Users\sabagithub\AppData\Roaming\npm\node_modules\catai
$ npm run list
node:net:426
    throw errnoException(err, 'open');
    ^

Error: open EISDIR
    at new Socket (node:net:426:13)
    at createWritableStdioStream (node:internal/bootstrap/switches/is_main_thread:80:18)
    at process.getStdout [as stdout] (node:internal/bootstrap/switches/is_main_thread:150:12)
    at console.get (node:internal/console/constructor:209:42)
    at console.value (node:internal/console/constructor:337:50)
    at console.log (node:internal/console/constructor:376:61)
    at runScript (node:internal/process/execution:94:7)
    at evalScript (node:internal/process/execution:104:10)
    at node:internal/main/eval_string:50:3 {
  errno: -4068,
  code: 'EISDIR',
  syscall: 'open'
}

Node.js v20.2.0
node:internal/modules/cjs/loader:1073
  throw err;
  ^

Error: Cannot find module 'C:\node_modules\npm\bin\npm-cli.js'
    at Module._resolveFilename (node:internal/modules/cjs/loader:1070:15)
    at Module._load (node:internal/modules/cjs/loader:923:27)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:83:12)
    at node:internal/main/run_main_module:23:47 {
  code: 'MODULE_NOT_FOUND',
  requireStack: []
}

Node.js v20.2.0
Could not determine Node.js install directory
node:net:426
    throw errnoException(err, 'open');
    ^

Error: open EISDIR
    at new Socket (node:net:426:13)
    at createWritableStdioStream (node:internal/bootstrap/switches/is_main_thread:80:18)
    at process.getStdout [as stdout] (node:internal/bootstrap/switches/is_main_thread:150:12)
    at console.get (node:internal/console/constructor:209:42)
    at console.value (node:internal/console/constructor:337:50)
    at console.log (node:internal/console/constructor:376:61)
    at runScript (node:internal/process/execution:94:7)
    at evalScript (node:internal/process/execution:104:10)
    at node:internal/main/eval_string:50:3 {
  errno: -4068,
  code: 'EISDIR',
  syscall: 'open'
}

Node.js v20.2.0
node:internal/modules/cjs/loader:1073
  throw err;
  ^

Error: Cannot find module 'C:\node_modules\npm\bin\npm-cli.js'
    at Module._resolveFilename (node:internal/modules/cjs/loader:1070:15)
    at Module._load (node:internal/modules/cjs/loader:923:27)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:83:12)
    at node:internal/main/run_main_module:23:47 {
  code: 'MODULE_NOT_FOUND',
  requireStack: []
}

Node.js v20.2.0
Could not determine Node.js install directory
    at file:///C:/Users/sabagithub/AppData/Roaming/npm/node_modules/catai/scripts/cli.js:62:27
exit code: 1
```
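For anyone hitting the same trace: the `MODULE_NOT_FOUND` on `npm-cli.js` plus "Could not determine Node.js install directory" suggests the shell is resolving `node`/`npm` against a stale path from the previous Node installation. A quick hedged diagnostic to run in the failing shell (output values below are examples, not guarantees):

```shell
# Confirm which node/npm the shell actually resolves, and where npm puts
# global packages such as catai. All three should point into the same
# (nvm-managed) install directory.
node -v                  # active Node version, e.g. v20.2.0
command -v node          # full path to the resolved node binary
npm config get prefix    # global-package directory npm will use
```

If `command -v node` and `npm config get prefix` disagree, the PATH still contains leftovers from the removed installation.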

ido-pluto commented 1 year ago

Try running this inside git-bash: https://git-scm.com/

What hardware do you have?

By the way, the nvm installation is primarily for Linux users; for Windows, the official installer is good enough.

sabagithub commented 1 year ago

Thanks for the quick response. I am using Windows 10 on an Intel i7-6700 with 16GB of RAM and a GTX 980 4GB. I installed everything via the Windows command line.

I tried it in the git-bash console, and I can access Catai via catai commands (without running each command manually). I no longer get the MODULE_NOT_FOUND. Thanks!

However, when I type `catai serve` (still in git-bash) I get the 'nGpuLayers' error.

```
sabagithub@DESKTOP-SABA MINGW64 ~
$ catai serve
$ cd C:\Users\sabagithub\AppData\Roaming\nvm\v20.2.0\node_modules\catai
$ npm start -- --production true --ui catai

> catai@0.3.8 start
> node src/index.js --production true --ui catai

Listening on http://127.0.0.1:3000
file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63
      this.instance = yield LLama.load(path, rest, enableLogging);
                                  ^

Error: Missing field nGpuLayers
    at LLamaCpp.<anonymous> (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63:35)
    at Generator.next (<anonymous>)
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:33:61
    at new Promise (<anonymous>)
    at __async (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:17:10)
    at LLamaCpp.load (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:61:12)
    at LLM.load (C:\Users\sabagithub\AppData\Roaming\nvm\v20.2.0\node_modules\catai\node_modules\llama-node\dist\index.cjs:52:21)
    at #addNew (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:88:21)
    at new NodeLlamaActivePull (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:19:38)
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/node-llama.js:8:48 {
  code: 'InvalidArg'
}

Node.js v20.2.0
file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63
      this.instance = yield LLama.load(path, rest, enableLogging);
                                  ^

Error: Missing field nGpuLayers
    at LLamaCpp.<anonymous> (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:63:35)
    at Generator.next (<anonymous>)
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:33:61
    at new Promise (<anonymous>)
    at __async (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:17:10)
    at LLamaCpp.load (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/node_modules/llama-node/dist/llm/llama-cpp.js:61:12)
    at LLM.load (C:\Users\sabagithub\AppData\Roaming\nvm\v20.2.0\node_modules\catai\node_modules\llama-node\dist\index.cjs:52:21)
    at #addNew (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:88:21)
    at new NodeLlamaActivePull (file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/process-pull.js:19:38)
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/src/alpaca-client/node-llama/node-llama.js:8:48 {
  code: 'InvalidArg'
}

Node.js v20.2.0
    at file:///C:/Users/sabagithub/AppData/Roaming/nvm/v20.2.0/node_modules/catai/scripts/cli.js:69:27
exit code: 1
```
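For context on the trace above: the `InvalidArg` comes from `LLamaCpp.load()` rejecting a config object that lacks the `nGpuLayers` field that a newer llama-node started requiring. A minimal sketch of that failure mode (this is not llama-node's actual validation code, and apart from `nGpuLayers` the field names are illustrative assumptions):

```javascript
// Minimal sketch, not llama-node's real code: the native loader validates
// its config object and rejects it when a required field is absent.
// "nGpuLayers" is the field named in the stack trace above.
function validateConfig(config, requiredFields) {
  for (const field of requiredFields) {
    if (!(field in config)) {
      throw new Error(`Missing field '${field}'`);
    }
  }
  return config;
}

// nGpuLayers: 0 means "run fully on the CPU"; a config generated by an
// older catai simply never set the field, so validation started failing.
const config = validateConfig(
  { nCtx: 1024, nGpuLayers: 0 },
  ["nCtx", "nGpuLayers"],
);
console.log(config.nGpuLayers); // 0
```

Omitting `nGpuLayers` from the object reproduces a "Missing field" error analogous to the one in the log.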

ido-pluto commented 1 year ago

Fixed in the latest version; update with `catai update`.

sabagithub commented 1 year ago

Thanks ido-pluto, it works now!