ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

Does chat take options? #47

Closed. tmceld closed this issue 5 months ago

tmceld commented 5 months ago

Hi,

I'm trying to keep hallucinations down, so I was playing around with temperature, top_p, and top_k:

foo = ollama.chat(model='llama2', options={"temperature": 0.1, "top_p": 0.10, "top_k": 1}, messages=[{'role': 'system', 'content': systemStr}, ...])

but I'm finding no discernible difference. It struck me that maybe chat, unlike generate, doesn't take options?

Is this the case? I'm using chat because I want to send messages.

mxyng commented 5 months ago

It does. You can verify this by setting something like temperature=0, which will produce the same output each time.
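
For instance, a minimal sketch of that check (the prompt string is illustrative, not from the thread): call chat twice with identical messages at temperature 0 and compare the outputs.

```python
import ollama

# Ask the same question twice at temperature 0; greedy sampling should
# produce identical answers if `options` is being applied by chat.
messages = [{'role': 'user', 'content': 'Name the three primary colors.'}]

first = ollama.chat(model='llama2', messages=messages, options={'temperature': 0})
second = ollama.chat(model='llama2', messages=messages, options={'temperature': 0})

print(first['message']['content'] == second['message']['content'])  # expect True
```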

tmceld commented 5 months ago

Firstly, thank you for your reply, but I am still none the wiser. I was setting something like temperature=0; specifically, I was passing it as a dictionary: options={"temperature": 0.1}. I wonder, in terms of a chat call, where I should set it?

joelewing commented 4 months ago

@tmceld I found an example here: https://github.com/ollama/ollama-python/blob/main/examples/fill-in-middle/main.py#L16

tmceld commented 4 months ago

Thanks @joelewing, I had found examples for the generate endpoint, but I was wondering about the chat endpoint.
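
For the record, chat accepts the same per-request options dict that generate does, passed alongside model and messages. A minimal sketch (the message contents here are placeholders, not from the thread):

```python
import ollama

# Per-request sampling options on the chat endpoint, mirroring the
# generate example linked above. Message contents are placeholders.
response = ollama.chat(
    model='llama2',
    messages=[
        {'role': 'system', 'content': 'Answer only from the provided context.'},
        {'role': 'user', 'content': 'What is the capital of France?'},
    ],
    options={'temperature': 0.1, 'top_p': 0.1, 'top_k': 1},
)
print(response['message']['content'])
```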