withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

CLI does not work with Bun #146

Closed · drschwabe closed this issue 8 months ago

drschwabe commented 8 months ago

Issue description

CLI does not work with Bun

Expected Behavior

bunx --no node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf

model should load correctly and enter a chat

Actual Behavior

bunx --no node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf

Returns: Not enough non-option arguments: got 0, need at least 1

Steps to reproduce

bun install node-llama-cpp
bunx --no node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf

My Environment

Ubuntu 22 | "node-llama-cpp": "^2.8.5"

Additional Context

It would be great if this module worked with Bun.

Bun is super fast and also lets us use CommonJS modules in the same script alongside ESM-only modules like node-llama-cpp.
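For example, a single Bun file could mix both module styles, roughly like this (a loose sketch, not from the original report: the classes are assumed from node-llama-cpp's documented v2.x API, and the CommonJS package name is just a placeholder):

```typescript
// Rough sketch: Bun allows require() and ESM imports in the same file,
// so an ESM-only package like node-llama-cpp can sit next to a CommonJS dependency.
// Classes assumed from node-llama-cpp's v2.x API; "some-commonjs-package" is hypothetical.
import { LlamaModel, LlamaContext, LlamaChatSession } from "node-llama-cpp";

const legacyUtil = require("some-commonjs-package"); // CommonJS, same file

const model = new LlamaModel({ modelPath: "./models/llama-2-7b.Q4_K_M.gguf" });
const context = new LlamaContext({ model });
const session = new LlamaChatSession({ context });

console.log(await session.prompt("Hello!"));
```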

Relevant Features Used

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

giladgd commented 8 months ago

@drschwabe node-llama-cpp CLI works with Bun. bunx doesn't support the --no parameter, so you should omit it when using bunx.

You should use the CLI like this:

bunx --bun node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf
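(For context: the --bun flag tells bunx to run the CLI with the Bun runtime instead of Node.js.)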