withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License
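As a rough illustration of the JSON-schema feature mentioned in the project description above, here is a minimal sketch based on the v2-era API (the line this PR shipped in, 2.7.4); the model path and schema are placeholders, and exact option names may differ between versions:

```typescript
import {fileURLToPath} from "url";
import path from "path";
import {LlamaModel, LlamaContext, LlamaChatSession, LlamaJsonSchemaGrammar} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Placeholder path - point this at a local GGUF model file
const model = new LlamaModel({
    modelPath: path.join(__dirname, "models", "model.gguf")
});

// Constrain generation so the output always matches this JSON schema
const grammar = new LlamaJsonSchemaGrammar({
    type: "object",
    properties: {
        answer: {type: "string"},
        confidence: {type: "number"}
    }
} as const);

const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

const response = await session.prompt("How many planets are in the solar system?", {
    grammar,
    maxTokens: context.getContextSize()
});

// parse() returns an object typed according to the schema
const parsed = grammar.parse(response);
console.log(parsed);
```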

fix: bugs #80

Closed by giladgd 10 months ago

giladgd commented 10 months ago

Description of change

- Fixes #78
- Fixes #79
- Fixes #58
- Reintroduces #74 - added documentation on how to resolve it (link)

Pull-Request Checklist

github-actions[bot] commented 10 months ago

:tada: This PR is included in version 2.7.4 :tada:

The release is available on:

Your semantic-release bot :package::rocket: