Closed tmceld closed 5 months ago
It does. You can verify this by setting something like temperature=0, which will produce the same output each time.
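A minimal way to check this, sketched under the assumption of a locally running Ollama server with the llama2 model pulled (pinning seed alongside temperature=0 keeps sampling repeatable as well):

```python
def same_output_twice(prompt: str) -> bool:
    """Run the same generate call twice with greedy, seeded decoding
    and report whether the two responses match. Sketch only: requires
    the ollama package and a local server with llama2 pulled."""
    import ollama  # imported lazily so this sketch can be read without the package

    opts = {"temperature": 0, "seed": 7}
    runs = [
        ollama.generate(model="llama2", prompt=prompt, options=opts)["response"]
        for _ in range(2)
    ]
    return runs[0] == runs[1]
```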
Firstly, thank you for your reply, but I am still none the wiser. I was already setting a low temperature, passing it as a dictionary: options={"temperature": 0.1}. I wonder, in terms of a chat call, where do I set it?
@tmceld I found an example here: https://github.com/ollama/ollama-python/blob/main/examples/fill-in-middle/main.py#L16
Thanks @joelewing. I had found examples for the generate endpoint, but I was wondering about the chat endpoint.
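For reference, chat() accepts the same options dict as generate(). A minimal sketch, assuming the ollama Python package and a local server with llama2 pulled (the system/user strings here are placeholders):

```python
# Sampler options are passed the same way for chat() as for generate().
# These names mirror Ollama's modelfile parameters.
OPTS = {"temperature": 0.1, "top_p": 0.1, "top_k": 1}


def chat_with_options(system_str: str, user_str: str):
    """Send a system + user message pair with explicit sampler options.
    Sketch only: requires the ollama package and a running local server."""
    import ollama  # imported lazily so this sketch can be read without the package

    response = ollama.chat(
        model="llama2",
        options=OPTS,
        messages=[
            {"role": "system", "content": system_str},
            {"role": "user", "content": user_str},
        ],
    )
    return response["message"]["content"]
```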
Hi,
I'm trying to keep hallucinations down, so I was playing around with temperature, top_p, and top_k:

foo = ollama.chat(model='llama2',
                  options={"temperature": 0.1, "top_p": 0.10, "top_k": 1},
                  messages=[{'role': 'system', 'content': systemStr}, ...])

but I'm finding no discernible difference, when it struck me that maybe chat, unlike generate, doesn't take options. Is this the case? I am using chat as I want to send messages.