Closed: peperunas closed this issue 4 weeks ago
I commented on your PR as well, but there are issues with object generation and tool streaming in Ollama, so I think this is related.
https://github.com/sgomez/ollama-ai-provider?tab=readme-ov-file#tested-models-and-capabilities
These issues are also present with other providers like Groq, so it might be a good idea to start by using the Ollama model only for the writer agent.
PR: #214 Issue: #215
Feature Description
It would be great to support Ollama in addition to the OpenAI API.
Ollama API: https://github.com/ollama/ollama/blob/main/docs/api.md
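For reference, Ollama's native chat endpoint (`/api/chat`, documented in the linked api.md) accepts a JSON body with a model name and a list of messages. A minimal sketch of building such a request in Python follows; the model name `llama3` and the default port `11434` are examples, not anything Morphic currently uses:

```python
import json

# Ollama's default local endpoint; the path is from docs/api.md.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, messages: list, stream: bool = False) -> str:
    """Serialize a request body in the shape Ollama's /api/chat expects."""
    return json.dumps({"model": model, "messages": messages, "stream": stream})

# Example request body (model name is illustrative):
body = build_chat_request("llama3", [{"role": "user", "content": "Hello"}])
print(body)
```

The same body can then be POSTed to `OLLAMA_CHAT_URL` with any HTTP client.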
Use Case
Private LLMs
Additional context
Ollama supports a subset of the OpenAI endpoints (https://github.com/ollama/ollama/blob/main/docs/openai.md). I tried to set the environment variables accordingly, but Morphic does not produce any results.
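For anyone trying the same thing: a sketch of the environment variables involved, assuming the app reads an OpenAI-style base URL and API key (the exact variable names Morphic expects may differ; `OPENAI_API_BASE` and `OPENAI_API_KEY` here are assumptions):

```shell
# Point an OpenAI-compatible client at a local Ollama instance.
# /v1 is Ollama's OpenAI-compatibility prefix (see docs/openai.md).
export OPENAI_API_BASE="http://localhost:11434/v1"
# Ollama ignores the key, but OpenAI clients usually require a non-empty one.
export OPENAI_API_KEY="ollama"
```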
Ollama logs:
On Ollama's side I see a bunch of 400s:
Morphic logs
On Morphic side this is the error:
It seems that Morphic is sending invalid requests to Ollama?
I started working on a PR to add Ollama support (#190). I'm currently testing it.