hhughes opened 11 months ago
Motivation: I wanted to be able to tune temperature, top_k and top_p when using llama via the llm package. I copied almost all of the current params wholesale; if you think some do not belong, I'm happy to adjust the CL.
I left out stopping_criteria, logits_processor and grammar because pydantic didn't like their types (and I don't need them for now).
Defaults are set to the current values in llama-cpp-python; descriptions are taken from the llama-cpp-python documentation where defined.
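To illustrate the shape of what's being added, here is a minimal stdlib sketch (a dataclass standing in for the pydantic options model, since the real field types live in the CL itself). The `SamplingOptions` name is hypothetical; the defaults shown (temperature=0.8, top_k=40, top_p=0.95) are llama-cpp-python's documented completion defaults:

```python
# Hedged sketch: a dataclass stand-in for the pydantic options being added.
# SamplingOptions is a hypothetical name; defaults mirror llama-cpp-python.
from dataclasses import dataclass


@dataclass
class SamplingOptions:
    temperature: float = 0.8  # higher values make output more random
    top_k: int = 40           # sample only from the k most likely tokens
    top_p: float = 0.95       # nucleus sampling probability threshold

    def __post_init__(self):
        # Basic range checks, analogous to what pydantic validators provide
        if not 0.0 <= self.top_p <= 1.0:
            raise ValueError("top_p must be between 0 and 1")
        if self.top_k < 0:
            raise ValueError("top_k must be non-negative")
        if self.temperature < 0:
            raise ValueError("temperature must be non-negative")


# Usage: override only the params you want to tune
opts = SamplingOptions(temperature=0.2)
```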
New parameters: