jxnl / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License
6.7k stars 532 forks

general way or tutorial to support any LLM provider #733

Closed — jtoy closed this 3 days ago

jtoy commented 1 month ago

Is your feature request related to a problem? Please describe. I see that it supports many LLM providers, but it's unclear to me whether we need to code up support for every provider. For example, does it work with Groq's llama3-70b?

Describe the solution you'd like Can we have a tutorial and/or a generic interface showing how to make this work with any provider? That would increase adoption of the library.

tom-pham-visus commented 1 month ago

For example does it work with groq llama3-70b?

As long as that model can produce structured responses, I don't see why not. Have you tried it yourself first?

Edit: it seems like this library supports only a limited number of LLMs at the moment.

Tedfulk commented 1 month ago

@jtoy Yes, they support Groq's llama3-70b: groq-example

ivanleomk commented 3 days ago

@jtoy the easiest way to support a new provider is to use the OpenAI client, since that reuses the existing codebase's functionality. Otherwise, you can support a new inference provider by adding a new implementation of the Instructor base class.

Happy to take PRs if you've got an inference provider in mind, but closing this issue for now.
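To illustrate the second route in spirit: the pattern a provider integration follows is "embed the schema in the request, validate the reply against a Pydantic model, retry on failure." The sketch below is a self-contained illustration of that pattern using only Pydantic, not instructor's actual internals; the function names (`structured_call`, `fake_provider`) are hypothetical, and the demo stubs the provider with a function that returns fixed JSON.

```python
import json
from typing import Callable, Type, TypeVar

from pydantic import BaseModel, ValidationError

T = TypeVar("T", bound=BaseModel)


def structured_call(
    complete: Callable[[str], str],  # any provider's text-in/text-out function
    prompt: str,
    response_model: Type[T],
    max_retries: int = 2,
) -> T:
    """Ask the model for JSON matching response_model's schema and
    re-prompt with the validation error on failure."""
    schema = json.dumps(response_model.model_json_schema())
    request = f"{prompt}\n\nReply with JSON matching this schema:\n{schema}"
    last_err: ValidationError | None = None
    for _ in range(max_retries + 1):
        raw = complete(request)
        try:
            return response_model.model_validate_json(raw)
        except ValidationError as err:
            last_err = err
            request += f"\n\nThe previous reply failed validation:\n{err}\nTry again."
    raise last_err


# Demo with a stubbed "provider" that always returns valid JSON.
class User(BaseModel):
    name: str
    age: int


def fake_provider(prompt: str) -> str:
    return '{"name": "Jason", "age": 25}'


print(structured_call(fake_provider, "Extract: Jason is 25.", User))
```

Swapping `fake_provider` for a real provider's completion call is all the pattern requires, which is why any model that can reliably emit JSON can be supported this way.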