withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp, and force a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

Passing grammars #9

Closed arthurwolf closed 1 year ago

arthurwolf commented 1 year ago

I can't find a way to pass along a grammar when setting things up, the way you can with llama.cpp.

Am I missing something, or is this just not supported yet?

I need it to force the model to generate JSON, which is extra convenient when running a Node-based system...

Thanks!
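For context on what is being asked for: llama.cpp expresses these output constraints as GBNF grammars, and ships a complete JSON grammar as `grammars/json.gbnf`. A simplified sketch of such a grammar (illustrative only, not the exact bundled file) that restricts generation to a flat JSON object might look like:

```gbnf
# Minimal GBNF sketch: constrain output to a flat JSON object.
# Simplified for illustration; llama.cpp's bundled json.gbnf is
# the complete, recursive version.
root   ::= object
object ::= "{" ws ( pair ( "," ws pair )* )? ws "}"
pair   ::= ws string ws ":" ws value
value  ::= string | number | "true" | "false" | "null"
string ::= "\"" [^"\\]* "\""
number ::= "-"? [0-9]+ ("." [0-9]+)?
ws     ::= [ \t\n]*
```

During sampling, tokens that would violate the grammar are masked out, so the model can only ever emit text matching `root`; that is what "forcing JSON at the generation level" means here.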

arthurwolf commented 1 year ago

Thank you so much @giladgd !

github-actions[bot] commented 1 year ago

:tada: This issue has been resolved in version 2.1.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket: