Closed: drschwabe closed this issue 8 months ago
@drschwabe node-llama-cpp CLI works with Bun. bunx doesn't support the --no parameter, so you should omit it when using bunx. You should use the CLI like this:
bunx --bun node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf
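(Note: bunx respects a binary's Node shebang by default, so the --bun flag is what forces the CLI to actually run under the Bun runtime rather than Node.)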
Issue description
CLI does not work with Bun
Expected Behavior
bunx --no node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf
The model should load correctly and an interactive chat session should start.
Actual Behavior
bunx --no node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf
Returns: Not enough non-option arguments: got 0, need at least 1
Steps to reproduce
bun install node-llama-cpp
bunx --no node-llama-cpp chat --model ./models/llama-2-7b.Q4_K_M.gguf
My Environment
Ubuntu 22 | "node-llama-cpp": "^2.8.5"
Additional Context
It would be great if this module worked with Bun.
Bun is super fast and also lets us use CommonJS modules in the same script alongside ESM import-only modules like node-llama-cpp.
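For illustration, a minimal sketch of that mixed CommonJS/ESM usage under Bun; the node-llama-cpp names below assume its v2.x API (LlamaModel, LlamaContext, LlamaChatSession), and the file name and model path are hypothetical:

// chat.ts (hypothetical file name)
// Bun allows require() and import in the same file, so a CommonJS
// module can be used next to an ESM-only package.
import { LlamaModel, LlamaContext, LlamaChatSession } from "node-llama-cpp"; // ESM-only package (assumed v2.x API)

const path = require("node:path"); // CommonJS require, valid in a Bun script

const model = new LlamaModel({
    // import.meta.dir is Bun-specific; the model path is illustrative
    modelPath: path.join(import.meta.dir, "models", "llama-2-7b.Q4_K_M.gguf"),
});
const context = new LlamaContext({ model });
const session = new LlamaChatSession({ context });

// Top-level await works in Bun
console.log(await session.prompt("Hi there!"));

Run it with: bun chat.ts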
Relevant Features Used
Are you willing to resolve this issue by submitting a Pull Request?
Yes, I have the time, and I know how to start.