microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License

.Net: Test local model with Semantic Kernel (e.g., Llama via Ollama) #3990

Closed matthewbolanos closed 9 months ago

matthewbolanos commented 9 months ago

Deploy a Llama model locally with Ollama and validate that it works with Semantic Kernel and the existing IChatCompletionService interface.
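
A minimal sketch of what such a validation could look like: a hand-rolled IChatCompletionService that forwards a ChatHistory to Ollama's local /api/chat endpoint. The class name, the request/response shapes, and the http://localhost:11434 address are assumptions for illustration, not something defined in this issue.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Sketch only, not the shipped connector: implements the existing abstraction
// directly against a local Ollama server (assumed to be listening on :11434).
public sealed class OllamaChatCompletionService : IChatCompletionService
{
    private readonly HttpClient _http = new() { BaseAddress = new Uri("http://localhost:11434") };
    private readonly string _model;

    public OllamaChatCompletionService(string model) => _model = model;

    public IReadOnlyDictionary<string, object?> Attributes { get; } = new Dictionary<string, object?>();

    public async Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        // Ollama's /api/chat endpoint accepts OpenAI-style role/content messages.
        var request = new
        {
            model = _model,
            stream = false,
            messages = chatHistory.Select(m => new { role = m.Role.Label, content = m.Content })
        };

        using var response = await _http.PostAsJsonAsync("api/chat", request, cancellationToken);
        response.EnsureSuccessStatusCode();

        var payload = await response.Content.ReadFromJsonAsync<OllamaChatResponse>(cancellationToken: cancellationToken);
        return new[] { new ChatMessageContent(AuthorRole.Assistant, payload?.Message?.Content ?? string.Empty) };
    }

    // Streaming is out of scope for this validation sketch.
    public IAsyncEnumerable<StreamingChatMessageContent> GetStreamingChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
        => throw new NotSupportedException();

    private sealed record OllamaChatResponse(OllamaMessage? Message);
    private sealed record OllamaMessage(string? Role, string? Content);
}
```

Registering it on the kernel builder (for example via `builder.Services.AddSingleton<IChatCompletionService>(new OllamaChatCompletionService("llama2"))`, with an illustrative model id) would then let the existing chat-completion code paths run against the local model unchanged.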

matthewbolanos commented 9 months ago

Doesn't need to test function calling.

alliscode commented 9 months ago

I tested our IChatCompletionService with Ollama using Mistral. My conclusion is that our abstractions are good enough to allow this to work, but it does not currently work due to some implementation details in the Azure OpenAI SDK.

Once these issues are fixed in the Azure OpenAI SDK, our IChatCompletionService will support Ollama.
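
For context, a rough sketch of this kind of setup is below: reuse the built-in OpenAI connector but route it through an HttpClient whose handler rewrites requests to a local Ollama server exposing an OpenAI-compatible endpoint. The handler class, the localhost:11434 address, the `mistral` model id, and the exact overload parameter names are assumptions for illustration; whether the call then succeeds end to end is what the SDK implementation details mentioned above currently block.

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Sketch: point the existing OpenAI connector at a local Ollama server by
// supplying a custom HttpClient (parameter names here are assumptions).
var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(
    modelId: "mistral",
    apiKey: "unused",   // Ollama ignores the key, but the SDK expects one
    httpClient: new HttpClient(new OllamaRedirectHandler()));

var kernel = builder.Build();
var chat = kernel.GetRequiredService<IChatCompletionService>();

var history = new ChatHistory();
history.AddUserMessage("Why is the sky blue?");
Console.WriteLine((await chat.GetChatMessageContentsAsync(history))[0].Content);

// Rewrites every outgoing request (e.g. https://api.openai.com/v1/chat/completions)
// to the local Ollama endpoint while keeping the path and query intact.
public sealed class OllamaRedirectHandler : DelegatingHandler
{
    public OllamaRedirectHandler() : base(new HttpClientHandler()) { }

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        request.RequestUri = new Uri("http://localhost:11434" + request.RequestUri!.PathAndQuery);
        return base.SendAsync(request, cancellationToken);
    }
}
```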

stephentoub commented 8 months ago

> Once these issues are fixed in the Azure OpenAI SDK, our IChatCompletionService will support Ollama.

Are there open issues for that on Azure.AI.OpenAI? Is anyone working on it? What's the timeframe?

clement128 commented 8 months ago

Hello, any update on this?