Closed jtoy closed 3 days ago
For example does it work with groq llama3-70b?
As long as that model can produce structured responses, I don't see why not. Have you tried it yourself first?
Edit: it seems like this library supports a limited number of LLMs at the moment.
@jtoy Yes they support groq llama3 70b. groq-example
@jtoy the easiest way to support a new provider is to use the OpenAI client, since that reuses the existing codebase's functionality. Otherwise, you can support a new inference provider by adding a new implementation of the `Instructor` base class.
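To illustrate the OpenAI-client route, here is a minimal sketch of pointing the standard OpenAI client at Groq's OpenAI-compatible endpoint and wrapping it with `instructor`. The base URL, the `llama3-70b-8192` model id, and the `GROQ_API_KEY` environment variable are assumptions about the current Groq API, not something verified against this repo:

```python
# Sketch: using Groq via its OpenAI-compatible API with instructor.
# Assumes `instructor` and `openai` are installed and GROQ_API_KEY is set.
import os
from pydantic import BaseModel


class UserInfo(BaseModel):
    """Schema the LLM response is validated against."""
    name: str
    age: int


def extract_user(text: str) -> UserInfo:
    # Imports kept local so the schema above is usable without the
    # optional LLM dependencies installed.
    import instructor
    from openai import OpenAI

    # Point the standard OpenAI client at Groq's endpoint (assumed URL).
    client = instructor.from_openai(
        OpenAI(
            base_url="https://api.groq.com/openai/v1",
            api_key=os.environ["GROQ_API_KEY"],
        )
    )
    return client.chat.completions.create(
        model="llama3-70b-8192",  # assumed Groq model id
        response_model=UserInfo,  # instructor parses/validates into this
        messages=[{"role": "user", "content": text}],
    )


if __name__ == "__main__":
    print(extract_user("Jason is 25 years old."))
```

Because Groq exposes an OpenAI-compatible API, no new `Instructor` subclass is needed for this case; only providers with incompatible APIs require one.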
Happy to take PRs if you've got any inference provider in mind but closing this issue for now.
Is your feature request related to a problem? Please describe. I see it supports many LLM providers, but it's unclear to me whether we need to code up support for every provider. For example, does it work with groq llama3-70b?
Describe the solution you'd like Can we have a tutorial and/or a generic interface showing how to make this work with any provider? This would increase adoption of the library.