withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://withcatai.github.io/node-llama-cpp/
MIT License

feat(minor): add `--batchSize` and `--promptFile` options to the `chat` CLI command #145

stewartoallen closed this 6 months ago

stewartoallen commented 6 months ago

Adds feature parity with llama.cpp's `main` example (usage sketch below):

- `--promptFile` for testing larger initial prompts, including embedded text
- `--batchSize` to override the default batch size
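For reference, here is a minimal sketch of roughly what these two flags correspond to in the library's v2-era API. The file paths, the batch size value of 512, and the exact CLI invocation shown in the comment are illustrative assumptions, not details taken from this PR:

```ts
// Illustrative CLI invocation with the new flags (paths and values are
// assumptions, not taken from this PR):
//   npx node-llama-cpp chat --model ./model.gguf --promptFile ./prompt.txt --batchSize 512

import fs from "fs";
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

// Roughly what --promptFile enables: load a large initial prompt from disk
// instead of passing it inline on the command line.
const prompt = fs.readFileSync("./prompt.txt", "utf8");

const model = new LlamaModel({modelPath: "./model.gguf"});

// Roughly what --batchSize overrides: how many tokens the context evaluates
// per decode step (the library default applies otherwise).
const context = new LlamaContext({model, batchSize: 512});

const session = new LlamaChatSession({context});
console.log(await session.prompt(prompt));
```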

giladgd commented 6 months ago

@stewartoallen Thanks for the PR! I've opened another PR based on your changes for the beta branch, since I think it'd be more beneficial to have these changes in the version 3 beta.