langchain-ai / langchain-extract

🦜⛏️ Did you say you like data?
https://extract.langchain.com/
MIT License

Use this with local model #129

Open amztc34283 opened 6 months ago

amztc34283 commented 6 months ago

How can I use this with a local model?

eyurtsev commented 6 months ago

You can't without making some changes; we're using: https://python.langchain.com/docs/modules/model_io/chat/structured_output It should become possible once we add support for local models.

Until then, your best bet is a parsing approach, so you'd need to rewrite some of the code in the service to use one.
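The parsing approach described above can be sketched roughly as: prompt the model for JSON, then locate and parse the JSON in whatever text comes back. This is an illustrative standard-library sketch, not code from langchain-extract; the `extract_json` helper and the sample reply are made up for the example.

```python
import json
import re


def extract_json(text: str) -> dict:
    """Pull the first JSON object out of a model's free-form reply.

    Local models often wrap JSON in prose or markdown fences, so we
    search for the outermost braces instead of calling json.loads
    on the raw reply.
    """
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))


# A reply like a local model might produce:
reply = 'Sure! Here is the data:\n```json\n{"name": "Ada", "year": 1815}\n```'
record = extract_json(reply)
```

In the service, the extracted dict would then be validated against the user-supplied extraction schema before being returned.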

amztc34283 commented 5 months ago

Thanks. I asked a question about this function; I could probably copy it from the partner folder.

amztc34283 commented 5 months ago

I could also create a PR if this is something you want.

gaj995 commented 5 months ago

@amztc34283 Were you able to set it up with a local model? I want to test the Mistral model through Ollama; any ideas on the implementation?
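For Mistral through Ollama specifically, one untested option is to call Ollama's local REST API directly and feed the returned text into the parsing step above. The endpoint and payload below follow Ollama's documented `/api/generate` interface; the URL assumes a default local install, and `build_request`/`generate` are hypothetical helper names, not part of langchain-extract.

```python
import json
import urllib.request

# Default endpoint for a locally running Ollama server (assumption).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "mistral") -> dict:
    """Assemble the JSON payload Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "mistral") -> str:
    """Send the prompt to the local Ollama server and return its reply text."""
    payload = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Building the payload does not require the server to be running:
payload = build_request("Extract the people mentioned in the text below.")
```

Wiring this into langchain-extract would still mean replacing the structured-output calls in the service with this generate-then-parse flow, as discussed earlier in the thread.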