withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.
https://withcatai.github.io/node-llama-cpp/
MIT License

feat(minor): add more flags to the `chat` command #147

Closed · giladgd closed 6 months ago

giladgd commented 6 months ago

Description of change

Based on #145 by @stewartoallen, adapted for the beta branch.
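For context, a minimal sketch of what CLI chat flags of this kind typically map to in the library's programmatic API. The option names below (`temperature`, `topK`, `topP`, `maxTokens`) and the model path are assumptions based on node-llama-cpp's documented `LlamaChatSession` usage, not taken from this PR's diff or from #145.

```ts
// Sketch only: option names and the model path are assumptions based on
// node-llama-cpp's documented API, not on the exact flags added by this PR.
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

async function main() {
    const model = new LlamaModel({
        modelPath: "models/llama-model.gguf" // hypothetical model path
    });
    const context = new LlamaContext({model});
    const session = new LlamaChatSession({context});

    // Sampling settings of the kind that extra `chat` flags would expose
    const answer = await session.prompt("Hi there, how are you?", {
        temperature: 0.8,
        topK: 40,
        topP: 0.9,
        maxTokens: 256
    });
    console.log(answer);
}

main();
```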

Pull-Request Checklist

giladgd commented 6 months ago

Merged to master by mistake, reverting