ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

Setting up top k , Max tokens , context length ? #87

Closed timtensor closed 3 months ago

timtensor commented 3 months ago

Hi, I am using an Ollama chat model to call a Mistral model. How can one set the different parameters such as top k, top p, context length, and temperature?


mxyng commented 3 months ago

You can set them with the options keyword argument, e.g. ollama.chat(..., options={...}). There's an example here
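A minimal sketch of what that might look like, assuming a local Ollama server with a Mistral model pulled; the option keys (`temperature`, `top_k`, `top_p`, `num_ctx`, `num_predict`) follow Ollama's Modelfile parameter names:

```python
# Sampling options passed through ollama.chat's options keyword.
options = {
    "temperature": 0.7,   # sampling temperature
    "top_k": 40,          # restrict sampling to the 40 most likely tokens
    "top_p": 0.9,         # nucleus-sampling cutoff
    "num_ctx": 4096,      # context window length in tokens
    "num_predict": 256,   # maximum tokens to generate ("max tokens")
}
print(options["num_ctx"])

# With a running server and the model pulled (e.g. `ollama pull mistral`):
# import ollama
# response = ollama.chat(
#     model="mistral",
#     messages=[{"role": "user", "content": "Hello"}],
#     options=options,
# )
# print(response["message"]["content"])
```

The dictionary is forwarded to the server with the request, so any parameter you could set in a Modelfile can be overridden per call this way.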