withcatai / catai

Run AI ✨ assistant locally! with simple API for Node.js 🚀
https://withcatai.github.io/catai/
MIT License

Catastrophic - CatAI is completely broken (Segmentation fault core dumped fatal error) #30

Closed TheSystemGuy1337 closed 1 year ago

TheSystemGuy1337 commented 1 year ago

Spinning off into its own issue.

Please refer to the troubleshooting before opening an issue. You might find the solution there.

Describe the bug
After updating per the advice in issue #29, CatAI no longer starts up at all and throws a segmentation fault. The errors produced when trying to start CatAI are the following:

> catai@1.0.2 start
> node src/index.js --production true --ui catai

Segmentation fault (core dumped)

Using catai update results in a different error, but the same outcome:

fatal runtime error: Rust cannot catch foreign exceptions
Aborted (core dumped)
fatal runtime error: Rust cannot catch foreign exceptions
Aborted (core dumped)
    at file:///home/thesystemguy/.nvm/versions/node/v20.2.0/lib/node_modules/catai/scripts/cli.js:69:27
exit code: 134 (Process aborted)

Reinstallation does the same thing as using catai update. This is a showstopper problem.

Desktop (please complete the following information):
OS: Linux Mint 21.2 Cinnamon, Linux 5.15.0-76-generic
Browser: n/a (no start)
CatAI version: 1.0.2 (as advised in issue #29)
Node.js version: v18.16.0
CPU: AMD Ryzen 5 5600H with Radeon Graphics
RAM: 30.7 GiB (512 MiB reserved to graphics chipset)

ido-pluto commented 1 year ago

Version 1 is unstable and therefore in preview. You can downgrade to version 0 with:

npm -g i catai@0

Then install one of the supported models for this version.
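
For reference, a minimal downgrade-and-check sequence might look like this (npm ls -g is a plain npm command used here only to confirm which version ended up installed; it is not part of CatAI):

npm -g i catai@0
npm ls -g catai      # confirm a 0.x version is now installed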

TheSystemGuy1337 commented 1 year ago

What about 0.9? I want the last version before 1.0, as I use GPT4free.

TheSystemGuy1337 commented 1 year ago

Downgrading does not help. I'm still getting the same error, even after downgrading straight to 0.3.12

ido-pluto commented 1 year ago

That may be because you are using an unsupported model. Use 'catai install' to install one of the supported models.

The selected model is probably still the one from version 1. You can run 'catai use Vicuna-7B' to change the selected model.
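
As a sketch of that model switch, using only the commands mentioned in this thread (Vicuna-7B is the example model name given above; the list of available models may differ):

catai install        # install one of the supported models
catai use Vicuna-7B  # make it the selected model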

pavelpiha commented 1 year ago

I'm facing the same error on a local run. I'm using Windows 10, Node.js v19.9.0, catai commit 4bed6183b764fce9b353b750c595a0f687dc85ec.

/d/demo/catai/server $ npm start
> catai@1.0.1 start
> node src/index.js

/c/Program Files/nodejs/npm: line 44:  3625 Segmentation fault      "$NODE_EXE" "$NPM_CLI_JS" "$@"

TheSystemGuy1337 commented 1 year ago

This software is more or less abandonware and isn't being properly supported. Migrate to LocalAI or AutoGPT4all.

ido-pluto commented 1 year ago

The library that binds CatAI to the model (node-llama) stopped getting updates. I am now implementing something similar myself, but this can take a while. In the meantime, only the old GGML format will be supported :(

So catai version 1 and above is not recommended right now.

Use npm i -g catai@0 to downgrade

pavelpiha commented 1 year ago

Thank you for the response. Perhaps it matters that I'm trying to run it locally from a cloned repository, as described in Development. Could you possibly suggest which commit I should check out?

ido-pluto commented 1 year ago

You only need to downgrade the node-llama packages to version 1.5, and this should work
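
A minimal sketch of that downgrade inside the cloned repository, assuming the binding is published on npm under the node-llama name used above and that the server code lives in catai/server as in the earlier log:

cd catai/server
npm install node-llama@1.5   # pin the binding back to the 1.5 line
npm start                    # retry the local run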

pavelpiha commented 1 year ago

Yes, just did it. Fixed it by reverting package-lock.json in catai/server. Looks like the problem is in the dependencies.
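
For anyone else hitting this from a cloned checkout, one way to do that lockfile revert and reinstall (assuming the committed package-lock.json is the one you want back):

cd catai/server
git checkout -- package-lock.json   # discard local changes to the lockfile
npm ci                              # reinstall exactly what the lockfile pins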

TheSystemGuy1337 commented 1 year ago

This software is abandonware and relies on complicated package nonsense. Please consider using C++ or x86 assembly instead of a volatile language like Python. Might be painful, but at least it guarantees it actually bloody works instead of breaking CPU NX policies and segfaulting into /dev/null.