Closed by alliscode 4 months ago
If you redo connectors.AI.ollama, please also consider providing some basic OpenAIPromptExecutionSettings compatibility in OllamaPromptExecutionSettings. Currently, e.g., "temperature" in OpenAIPromptExecutionSettings becomes "options"."temperature" in OllamaPromptExecutionSettings. This breaks, for example, the SequentialPlanning settings from python/semantic_kernel/planners/sequential_planner/Plugins/SequentialPlanning/config.json. Yes, I know that Ollama uses "options", but I think exposing this difference to SK plugins etc. is bad for the most common parameters such as temperature; a sketch of a possible mapping is shown below.
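A minimal sketch of what such a compatibility shim could look like, assuming a flat OpenAI-style settings dict is translated into Ollama's nested "options" dict. The names `to_ollama_settings` and `COMMON_PARAMETERS` are hypothetical, chosen only for illustration, and do not reflect an actual SK API:

```python
# Hypothetical sketch: map flat OpenAI-style execution settings onto
# Ollama's nested "options" structure. Names are illustrative only.

# Parameters both connectors understand, just at different nesting levels.
COMMON_PARAMETERS = {"temperature", "top_p", "stop", "seed"}


def to_ollama_settings(openai_style_settings: dict) -> dict:
    """Translate e.g. {"temperature": 0.7} into {"options": {"temperature": 0.7}}."""
    options = {
        key: value
        for key, value in openai_style_settings.items()
        if key in COMMON_PARAMETERS
    }
    # Anything Ollama-specific (or unknown) is passed through at the top level.
    passthrough = {
        key: value
        for key, value in openai_style_settings.items()
        if key not in COMMON_PARAMETERS
    }
    return {**passthrough, "options": options}


# Example: settings loaded from a planner config such as
# .../Plugins/SequentialPlanning/config.json
print(to_ollama_settings({"temperature": 0.0}))
# -> {'options': {'temperature': 0.0}}
```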
Add support for Ollama using the Python Ollama SDK. This is separate from Ollama support using the OpenAI Connector.
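For reference, a rough sketch of what a connector built on the Python Ollama SDK (the `ollama` package) would call under the hood. The model name and option values are placeholders, and this is not the connector's actual implementation:

```python
# Rough sketch of calling the Python Ollama SDK directly; model name and
# option values are placeholders, not part of the actual connector.
import asyncio

from ollama import AsyncClient


async def chat_once(prompt: str) -> str:
    client = AsyncClient()  # defaults to the local Ollama server
    response = await client.chat(
        model="llama3",  # placeholder model
        messages=[{"role": "user", "content": prompt}],
        # Note: Ollama expects sampling parameters nested under "options",
        # which is exactly the difference discussed in the comment above.
        options={"temperature": 0.7},
    )
    return response["message"]["content"]


if __name__ == "__main__":
    print(asyncio.run(chat_once("Why is the sky blue?")))
```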