edgararuiz opened 6 days ago
Ok, this will require creating an ollama subclass and reimplementing the chat_request()
method.
Actually, I don't think there is any need to create a subclass for Ollama. There are two Ollama APIs: elmer
uses http://localhost:11434/v1/chat/completions, but the API doc the OP is referring to describes http://localhost:11434/api/generate. The former mimics the OpenAI API, so there is no need to nest the seed under options unless you are using the latter (I wasted hours figuring this out).
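To make the difference concrete, here is roughly what a request body to the native generate endpoint looks like, with seed nested inside options (a sketch based on the Ollama API docs; the model name and option values are illustrative):

```json
{
  "model": "llama3.2",
  "prompt": "hello",
  "options": {
    "seed": 100
  }
}
```

This shape only applies to the /api/generate and /api/chat endpoints; the OpenAI-compatible endpoint elmer actually calls does not use an options object.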
The reprex is just missing a second call to Ollama; it should look like the snippet below. The seed is working on my end.
chat <- elmer::chat_ollama(model = "llama3.2", seed = 100)
chat$chat("hello")
chat <- elmer::chat_ollama(model = "llama3.2", seed = 100) ##<< this is missing in the reprex
chat$chat("hello")
It looks like seed is not working when used in chat(); I get no consistent responses when setting it. I also ran the same test with ollamar and did receive consistent results.

Created on 2024-11-19 with reprex v2.1.0
The root cause may be that elmer is using the exact same REST call used for OpenAI: https://github.com/tidyverse/elmer/blob/848fd2f0739a73863c86246a3e1f8fd95b73d4ef/R/provider-openai.R#L129-L139. In that API, seed is set at the same level as messages (https://platform.openai.com/docs/api-reference/chat/create), but in Ollama's native API, seed goes under the options
section: https://github.com/ollama/ollama/blob/main/docs/api.md#generate-request-with-options.
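For comparison, the OpenAI-style request body that elmer sends to /v1/chat/completions puts seed at the top level, next to messages (a sketch following the OpenAI Chat Completions reference; values are illustrative):

```json
{
  "model": "llama3.2",
  "messages": [
    { "role": "user", "content": "hello" }
  ],
  "seed": 100
}
```

Since Ollama's OpenAI-compatible endpoint accepts this same top-level placement, no Ollama-specific nesting is needed as long as elmer keeps targeting /v1/chat/completions.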