Closed: Vold-Hal closed this issue 4 days ago.
Probably should have added: I am running Ollama locally through WSL.
@Vold-Hal, using Ollama with the OpenAI connector is not a supported scenario. Please use our Ollama connector.
I'd also suggest shifting to the new function calling abstractions:
```csharp
executionSettings: new OllamaPromptExecutionSettings { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() }
```
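For context, here's a minimal end-to-end sketch of how that might be wired up. The model id, endpoint, and plugin are assumptions for illustration, not values from this thread:

```csharp
// Sketch only: assumes the Microsoft.SemanticKernel.Connectors.Ollama preview package.
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Ollama;

var builder = Kernel.CreateBuilder();

// Hypothetical model id and the default local Ollama endpoint.
builder.AddOllamaChatCompletion(modelId: "llama3.1", endpoint: new Uri("http://localhost:11434"));

// Hypothetical plugin standing in for the reporter's EmbeddingPlugin.
builder.Plugins.AddFromType<EmbeddingPlugin>();

var kernel = builder.Build();

// Let the connector advertise kernel functions and invoke them automatically.
var settings = new OllamaPromptExecutionSettings { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() };

var result = await kernel.InvokePromptAsync(
    "Get the vector for 'example text'.",
    new KernelArguments(settings));

Console.WriteLine(result);

public sealed class EmbeddingPlugin
{
    [KernelFunction, Description("Returns a vector for the given text.")]
    public float[] GetVectorFromString(string text) => new float[] { 0.1f, 0.2f }; // placeholder
}
```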
Let me know if that worked for you.
Thanks!
@Vold-Hal Function invocation is currently not supported by Ollama for streaming APIs. Once this feature becomes available and is added to the OllamaSharp library, our connector will be able to trigger function calling for streaming outputs.
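Until then, one possible workaround (a sketch, continuing with the same assumed setup as above) is to fall back to the non-streaming API whenever tool calls are expected:

```csharp
// Workaround sketch: use the non-streaming API when function calling is needed,
// since GetStreamingChatMessageContentsAsync() won't auto-invoke tools here yet.
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

var builder = Kernel.CreateBuilder();
builder.AddOllamaChatCompletion(modelId: "llama3.1", endpoint: new Uri("http://localhost:11434")); // assumed values
// (plugin registration omitted; see the sketch above)
var kernel = builder.Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("Get the vector for 'example text'.");

var settings = new OllamaPromptExecutionSettings { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() };

// Non-streaming call: the connector can complete the full tool-call round trip here.
var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply.Content);
```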
See:
**Describe the bug**
When using `GetStreamingChatMessageContentsAsync()` with Ollama models, the response is returned as a JSON object containing the function invocation details (name and parameters), but the tool function is not automatically invoked. `GetChatMessageContentAsync()` usually triggers the plugin function. The issue may be caused by the way responses are segmented and handled in streaming mode, potentially leading to non-executed function calls.

**To Reproduce**

**Expected behavior**
A function should be called; instead it just outputs `{"name": "EmbeddingPlugin-get_vector_from_string", "parameters": {"text": "example text"}}` in chat.

**Platform**
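For anyone hitting this before upstream support lands, here's a rough workaround sketch: accumulate the streamed chunks into a string and, if the result parses as the tool-call JSON shown above, invoke the kernel function manually. This assumes the `PluginName-function_name` format from the report and a `kernel` with the plugin already registered:

```csharp
using System.Text.Json;
using Microsoft.SemanticKernel;

// streamedText: the concatenation of all streaming chunks, e.g.
// {"name": "EmbeddingPlugin-get_vector_from_string", "parameters": {"text": "example text"}}
static async Task<object?> TryInvokeToolCallAsync(Kernel kernel, string streamedText)
{
    JsonElement root;
    try { root = JsonDocument.Parse(streamedText).RootElement; }
    catch (JsonException) { return null; } // plain text, not a tool call

    if (!root.TryGetProperty("name", out var name)) return null;

    // "EmbeddingPlugin-get_vector_from_string" -> plugin + function (assumed separator).
    var parts = name.GetString()!.Split('-', 2);

    var args = new KernelArguments();
    if (root.TryGetProperty("parameters", out var parameters))
        foreach (var p in parameters.EnumerateObject())
            args[p.Name] = p.Value.ToString();

    var result = await kernel.InvokeAsync(parts[0], parts[1], args);
    return result.GetValue<object>();
}
```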