BLaZeKiLL / Codeblaze.SemanticKernel

Some Cool Semantic Kernel Plugins
MIT License

Enhance chat completion instead of using simple completion. #9

Closed kbeaugrand closed 7 months ago

kbeaugrand commented 7 months ago

Motivation

I want to correctly use chat completion with Ollama models.

Context

The current implementation uses Ollama's simple generation API as the backend for chat completion. I changed that to use the chat completion API instead.
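For context, Ollama exposes both a generation endpoint (a single prompt in, a completion out) and a chat endpoint (/api/chat), which accepts a structured list of role/content messages. The sketch below is a minimal, hypothetical illustration of calling /api/chat directly with HttpClient; it is not the connector's actual code, and the host, model name, and messages are assumptions.

```csharp
// Minimal sketch (not the connector's code): calling Ollama's /api/chat
// endpoint directly. The host (http://localhost:11434), model name, and
// message content below are illustrative assumptions.
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class OllamaChatSketch
{
    static async Task Main()
    {
        using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

        // /api/chat takes a list of role/content messages, so multi-turn
        // history is sent as structured data instead of being flattened
        // into a single prompt string for the generation endpoint.
        var payload = new
        {
            model = "llama3",
            stream = false,
            messages = new[]
            {
                new { role = "system", content = "You are a helpful assistant." },
                new { role = "user", content = "Why use /api/chat instead of the generate endpoint?" }
            }
        };

        var response = await http.PostAsync(
            "/api/chat",
            new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // The non-streaming response carries the assistant reply under "message".
        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        var content = doc.RootElement.GetProperty("message").GetProperty("content").GetString();
        Console.WriteLine(content);
    }
}
```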

BLaZeKiLL commented 7 months ago

Woah, thanks for this. For some reason I didn't notice at all that Ollama had a /api/chat endpoint.

kbeaugrand commented 7 months ago

> Woah, thanks for this. For some reason I didn't notice at all that Ollama had a /api/chat endpoint.

Thank you for the commit. I'm preparing a release of my assistant that leverages your connector to back the assistant with hosted Ollama models... Stay tuned