Closed: matthewbolanos closed this issue 11 months ago
Doesn't need to test function calling.
I tested our IChatCompletionService with Ollama using mistral. My conclusion is that our abstractions are good enough to allow this to work, but it does not currently work due to some implementation details in the Azure OpenAI SDK: the request body needs `streaming=false`, which is never sent by the Azure OpenAI SDK; it serializes either `true` or `null` (i.e. the field is missing). Once these issues are fixed in the Azure OpenAI SDK, our IChatCompletionService will support Ollama.
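A minimal offline sketch of the request-body difference being described, assuming the standard OpenAI chat-completions wire format (where the flag is named `stream`; the comment above calls it "streaming", and "mistral" is just an example model name):

```python
import json

# What Ollama reportedly needs for a non-streaming call: the flag sent explicitly.
explicit_false = {"model": "mistral", "messages": [], "stream": False}

# What the Azure OpenAI SDK sends for a non-streaming call: the flag omitted
# entirely, rather than serialized as false.
flag_omitted = {"model": "mistral", "messages": []}

print(json.dumps(explicit_false))  # body with "stream": false spelled out
print("stream" in flag_omitted)    # prints False: the key is simply absent
```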
Are there issues open on that for Azure.AI.OpenAI? Is anyone working on it? Timeframe?
Hello, any update on this?
Deploy a Llama model locally with Ollama and validate that it works with Semantic Kernel and the existing `IChatCompletionService` interface.
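A hypothetical sketch of what such a validation request could look like, assuming Ollama's default OpenAI-compatible endpoint on port 11434 and an example model name (`llama2`). It only builds the request object offline and does not call a server:

```python
import json
import urllib.request

# Assumed default: Ollama serves an OpenAI-compatible API on localhost:11434.
OLLAMA_CHAT_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming chat-completion request for a local Ollama server."""
    body = {
        "model": model,   # example model name, e.g. "llama2"
        "stream": False,  # sent explicitly, per the SDK discussion above
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama2", "Say hello in one sentence.")
print(req.full_url)
```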