simonw / llm-llama-cpp

LLM plugin for running models using llama.cpp
Apache License 2.0

Option to change max_tokens #20

Closed · simonw closed this issue 11 months ago

simonw commented 11 months ago

> I'm going to use this as the default and add an option to change it.

Originally posted by @simonw in https://github.com/simonw/llm-llama-cpp/issues/18#issuecomment-1738371727
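For context, llm plugins declare per-model options as a nested pydantic `Options` class, which is the mechanism a `max_tokens` option would use here. A minimal sketch of the pattern, with an illustrative model class name and a hypothetical default value (not the plugin's actual code):

```python
import llm
from typing import Optional
from pydantic import Field


class LlamaCpp(llm.Model):
    model_id = "llama-cpp"

    class Options(llm.Options):
        max_tokens: Optional[int] = Field(
            description="Maximum number of tokens to generate",
            default=4000,  # hypothetical default, for illustration only
        )

    def execute(self, prompt, stream, response, conversation):
        # A real implementation would call llama-cpp-python here and
        # pass prompt.options.max_tokens through to the completion call
        ...
```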

simonw commented 11 months ago

Released in https://github.com/simonw/llm-llama-cpp/releases/tag/0.2b1
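With the option released, it can be set per-prompt from Python. A hedged usage sketch, where the model alias is hypothetical (use whatever alias your downloaded llama.cpp model is registered under):

```python
import llm

# Hypothetical alias for an installed llama.cpp model
model = llm.get_model("llama-2-7b-chat")

# Options declared on the model's Options class are passed as
# keyword arguments to prompt()
response = model.prompt("Say hello in five words", max_tokens=100)
print(response.text())
```

The CLI equivalent uses `-o`, e.g. `llm -m llama-2-7b-chat -o max_tokens 100 "Say hello in five words"`.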