Open syberkitten opened 6 months ago
@syberkitten does Instructor work with OpenRouter for other models? I tried with mixtral-8x22b-instruct and get this error:
AssertionError: Instructor does not support multiple tool calls, use List[Model] instead.
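For context, this assertion fires when the model returns several tool calls at once; the error message points at wrapping the response model in `List[...]`. A minimal sketch of that pattern (the `User` schema and sample data are made up for illustration):

```python
from typing import List

from pydantic import BaseModel, TypeAdapter


# Hypothetical extraction schema, just for illustration.
class User(BaseModel):
    name: str
    age: int


# Instead of response_model=User, the error message suggests
# response_model=List[User] so multiple tool calls can be collected.
# With an instructor-patched client, that would look like:
#   users = client.chat.completions.create(..., response_model=List[User])

# Validating a sample payload against the wrapped type:
users = TypeAdapter(List[User]).validate_python(
    [{"name": "Ada", "age": 36}, {"name": "Alan", "age": 41}]
)
print([u.name for u in users])  # → ['Ada', 'Alan']
```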
I think I found a solution here https://python.useinstructor.com/examples/ollama/#ollama
same bug as https://github.com/jxnl/instructor/issues/735
@ApurvaMisra did your solution work?
> I think I found a solution here https://python.useinstructor.com/examples/ollama/#ollama
Yup, this worked for me
```python
client = instructor.from_openai(
    OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    ),
    mode=instructor.Mode.JSON,
)
```
I found that MD_JSON mode works widely across many models.
What Model are you using?
Describe the bug
Instructor fails to work with OpenRouter when using Anthropic models.

```python
base_url = 'https://openrouter.ai/api/v1'
client = instructor.from_openai(AsyncOpenAI(base_url=base_url, api_key=openrouter_api_key))
client.chat.completions.create(
    **params,
    model='anthropic/claude-3-sonnet',
    extra_headers=headers,
    response_model=response_model,
)
```
The error we get points at this snippet of Instructor's source:

```
) -> BaseModel:
    message = completion.choices[0].message
    assert (
```
The same code with 'openai/gpt-4o' works perfectly over OpenRouter, and we get a response model as expected:

```python
client.chat.completions.create(
    **params,
    model='openai/gpt-4o',
    extra_headers=headers,
    response_model=response_model,
)
```
If you try to use AsyncAnthropic instead of AsyncOpenAI, the error gets even weirder: there is a long stack trace, and the outermost error suggests the client is not reaching the proper endpoint.
Expected behavior
Allow Anthropic models to be used over OpenRouter, whether via the OpenAI client or the Anthropic client (the latter would be better, since the two APIs are not identical).