Open amztc34283 opened 6 months ago
Not without making some changes; we're using https://python.langchain.com/docs/modules/model_io/chat/structured_output. It should become possible as we add support for local models.
Until then, your best bet is a parsing approach, so you'd need to rewrite some of the code in the service to parse the model's output instead.
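For anyone landing here: a minimal sketch of what a parsing approach could look like. Nothing below is from the service's actual code; the helper name and the stubbed reply are made up for illustration. The idea is to ask the model to answer in JSON and then extract/parse the JSON yourself, since local models often wrap it in prose or markdown fences.

```python
import json
import re

def parse_structured(raw: str) -> dict:
    """Pull the first JSON object out of a model reply and parse it.

    Local models often wrap JSON in prose or ```json fences, so we
    search for the outermost braces instead of calling json.loads
    on the whole reply.
    """
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if match is None:
        raise ValueError(f"no JSON object found in reply: {raw!r}")
    return json.loads(match.group(0))

# Stubbed model reply (no model call here), just to show the parsing step:
reply = 'Sure! Here is the result:\n```json\n{"name": "Ada", "age": 36}\n```'
print(parse_structured(reply))  # {'name': 'Ada', 'age': 36}
```

You'd swap the stubbed `reply` for whatever your local model returns, and add retries/validation as needed.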
Thanks. I asked a question about this function; I could probably copy it from the partner folder.
I could also create a PR if this is something you want.
@amztc34283 Were you able to set it up with a local model? I want to test the Mistral model through Ollama; any ideas on the implementation?
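Not the original poster, but a minimal sketch of talking to Ollama directly, assuming the default local endpoint (`localhost:11434`) and the `mistral` model tag pulled via `ollama pull mistral`. The function names here are made up; only the endpoint and payload fields come from Ollama's REST API.

```python
import json
import urllib.request

# Default Ollama endpoint (assumption: stock local install).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "mistral") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, model: str = "mistral") -> str:
    """POST the prompt to a locally running Ollama server, return the reply text."""
    data = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# ask_ollama("Reply only with JSON: ...")  # requires `ollama run mistral` locally
```

From there you can feed the reply into whatever parsing you use for structured output.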
How can I use this with a local model?