https://ollama.com/blog/tool-support claims that their OpenAI-compatible backend supports tool calling.
...
It worked for me once:
> chat$chat("What's the current date in YMD format?")
[1] "The current date is 1 January 2024."
But not again 😬
Reprex for current version:
library(elmer)
current_date <- function() "2024-01-01"
chat <- chat_ollama(model = "llama3.2", echo = FALSE)
chat$register_tool(tool(current_date, "Compute the current date"))
chat$chat("What's the current date in YMD format?")
Hmmmm, but looking at the chat history I see:
<Chat turns=4 tokens=243/302>
── user ──────────────────────────────────────────────────────────────────────────────────────────────────
What's the current date in YMD format?
── assistant ─────────────────────────────────────────────────────────────────────────────────────────────
[tool request (call_cipg5r79)]: current_date(format = "YMD")
── user ──────────────────────────────────────────────────────────────────────────────────────────────────
[tool result (call_cipg5r79)]:
So the model is correctly generating the tool call, but it's passing an argument (format = "YMD") that the function doesn't accept.
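To spell out why that breaks: the registered function declares no arguments, so in plain R that call simply errors, which presumably explains the empty tool result in the chat history above.
current_date <- function() "2024-01-01"
current_date(format = "YMD")
#> Error in current_date(format = "YMD") : unused argument (format = "YMD")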
This works:
library(elmer)
current_date <- function(...) "2024-01-01"
chat <- chat_ollama(model = "llama3.2", echo = FALSE)
chat$register_tool(tool(current_date, "Compute the current date"))
chat$chat("What's the current date in YMD format?")
#> [1] "The current date in YMD format is 2024-01-01."
Created on 2024-10-28 with reprex v2.1.0
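A variant of that workaround (untested sketch; depending on the elmer version you may also need to describe the argument when registering the tool) is to declare the argument the model keeps inventing and just ignore it:
current_date <- function(format = NULL) {
  # `format` is accepted but ignored; the reprex always returns a fixed date
  "2024-01-01"
}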
Oh, ha. Not sure what went wrong for me, but good to know! I'll update the title to reflect that it's only streaming tool calls that don't work, but I think that's a fundamental issue with the ollama executable.
FWIW, I saw similar behavior with Azure OpenAI: streaming caused tool calling to break. It seemed to depend on the Azure OpenAI API version, but I didn't understand why.
This also works:
library(elmer)
current_date <- function() "2024-01-01"
chat <- chat_ollama(model = "mistral-nemo", echo = FALSE)
chat$register_tool(tool(current_date, "Compute the current date"))
chat$chat("What's the current date in YMD format?")
So I think that the problem is mostly a limitation of the ollama models. I'll add a note to the docs.
I don't know for sure, but it smells like this is a limitation/bug specific to the OpenAI compatibility endpoint.
It's also worth noting, though, that streaming tool calls don't appear to work at all with any Ollama model at the moment. See https://github.com/ollama/ollama/issues/5796 and https://github.com/ollama/ollama/pull/6452
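For anyone who wants to poke at this outside of elmer, here's a rough httr2 sketch (untested; the endpoint path and payload shape follow Ollama's OpenAI-compatibility docs and the standard OpenAI tools format) that sends the same tool through the compatibility endpoint with streaming off. Flipping stream to TRUE (and handling the SSE chunks) is where the linked issues report things falling apart.
library(httr2)

resp <- request("http://localhost:11434/v1/chat/completions") |>
  req_body_json(list(
    model = "llama3.2",
    stream = FALSE,  # the linked ollama issues concern stream = TRUE
    messages = list(
      list(role = "user", content = "What's the current date in YMD format?")
    ),
    tools = list(
      list(
        type = "function",
        `function` = list(
          name = "current_date",
          description = "Compute the current date"
          # parameters omitted: the tool takes no arguments
        )
      )
    )
  )) |>
  req_perform()

# With stream = FALSE the tool call should come back in one piece
resp_body_json(resp)$choices[[1]]$message$tool_calls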