withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

Bun support #152

Closed: mnlttt closed this issue 8 months ago

mnlttt commented 8 months ago

Feature Description

Currently, running the library with Bun throws an error:

[Screenshot: error output from running the library under Bun (CleanShot 2024-01-29)]

Is this Bun's fault, or something else?

The Solution

It should run the same as it does on Node.js.

Considered Alternatives

Well, don't use Bun?

Additional Context

No response

Related Features to This Feature Request

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

giladgd commented 8 months ago

@mnlttt From my tests, node-llama-cpp works great on Bun, so I suspect there's a misconfiguration in your environment, or your Bun version is not up to date.