Open DiyarD opened 1 month ago
Related to https://github.com/ollama/ollama/issues/6473.
Basically, Ollama does not support structured generation, only JSON mode. JSON mode makes no guarantee that the output will match the schema you asked for, or even that it will be valid JSON. For the moment, I would recommend LM Studio as an alternative that supports Outlines. Here's a guide to using it.
You can also use the instructor library or ollama-instructor (https://github.com/lennartpollvogt/ollama-instructor).
Beware that Instructor doesn't do structured generation; it only parses the LLM's output into a Pydantic object after the fact.
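To illustrate the difference, here is a minimal sketch of the parse-and-validate approach Instructor takes. The Pydantic model, field names, and `llama3` model name are placeholders for illustration; the Instructor call is commented out because it assumes a locally running Ollama server exposing its OpenAI-compatible endpoint.

```python
# Sketch of the validate-after-generation approach: the model is merely
# asked for JSON, and Pydantic validation catches schema mismatches that
# JSON mode alone cannot prevent.
from pydantic import BaseModel, ValidationError


class Character(BaseModel):  # placeholder schema for illustration
    name: str
    age: int


def parse_character(raw_json: str) -> Character:
    """Validate raw model output into a Character; raises ValidationError
    if the output is valid JSON but does not match the schema."""
    return Character.model_validate_json(raw_json)


# Hypothetical Instructor call against Ollama's OpenAI-compatible endpoint.
# Uncomment with a running server; base_url and model name are assumptions:
#
# import instructor
# from openai import OpenAI
#
# client = instructor.from_openai(
#     OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
#     mode=instructor.Mode.JSON,
# )
# character = client.chat.completions.create(
#     model="llama3",
#     response_model=Character,
#     messages=[{"role": "user", "content": "Invent a fantasy character."}],
# )

print(parse_character('{"name": "Arwen", "age": 2901}'))
```

The key point: validation only rejects bad output after the fact, whereas structured generation (as in Outlines) constrains the tokens so a schema violation cannot be produced in the first place.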
Describe the issue as clearly as possible:
I'm trying to use Ollama with the OpenAI API to generate a Pydantic object, and it fails. I think that's because Ollama doesn't support `response_format`, which is used in `outlines/generate/json.py`. Ollama does have a `format: json` mode, but that doesn't support adding a JSON schema. Is there any workaround for this?
Steps/code to reproduce the bug:
Expected result:
Error message:
Outlines/Python version information:
Context for the issue:
No response