Overview

Currently, Ollama support is an unstable, experimental feature. It is implemented using the Ollama AI Provider, which is explicitly documented as unstable for Object generation and Tool usage; Tool streaming is not supported at all. Because Morphic requires these capabilities, it is very unstable when running on Ollama. Please use it with an understanding of these limitations.

Environment Variables

OLLAMA_MODEL=[YOUR_OLLAMA_MODEL]
The main model to use. Recommended: mistral or openhermes
OLLAMA_SUB_MODEL=[YOUR_OLLAMA_SUB_MODEL]
The sub model to use. Recommended: phi3 or llama3

OLLAMA_BASE_URL=[YOUR_OLLAMA_BASE_URL]
The base URL of your Ollama server, e.g. http://localhost:11434
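Putting the variables together, a local configuration file might look like the sketch below. The values are illustrative: mistral, phi3, and the default base URL come from the recommendations above, while the OLLAMA_SUB_MODEL and OLLAMA_BASE_URL variable names are assumed from the pattern of the first variable rather than confirmed settings.

```shell
# Example .env.local for running Morphic against a local Ollama instance.
# OLLAMA_MODEL is documented above; the OLLAMA_SUB_MODEL and OLLAMA_BASE_URL
# names are assumptions shown for illustration only.

# Main model (alternatively: openhermes)
OLLAMA_MODEL=mistral
# Sub model (alternatively: llama3)
OLLAMA_SUB_MODEL=phi3
# Base URL of the local Ollama server (Ollama's default port is 11434)
OLLAMA_BASE_URL=http://localhost:11434
```

Any model referenced here must already be available locally, e.g. fetched beforehand with `ollama pull mistral`.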
PR
#214, #190, #200