jxnl / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License
6.8k stars · 547 forks

Instructor fails to work with OpenRouter when using Anthropic models #676

Open syberkitten opened 2 months ago

syberkitten commented 2 months ago

What Model are you using? anthropic/claude-3-sonnet, via OpenRouter.

Describe the bug: Instructor fails to work with OpenRouter when using Anthropic models.

```python
base_url = "https://openrouter.ai/api/v1"
client = instructor.from_openai(
    AsyncOpenAI(base_url=base_url, api_key=openrouter_api_key)
)
await client.chat.completions.create(
    **params,
    model="anthropic/claude-3-sonnet",
    extra_headers=headers,
    response_model=response_model,
)
```

The error we get is:

```
) -> BaseModel:
    message = completion.choices[0].message
    assert (
        len(message.tool_calls or []) == 1
    ), "Instructor does not support multiple tool calls, use List[Model] instead."
E   AssertionError: Instructor does not support multiple tool calls, use List[Model] instead.
```


The same code as above with 'openai/gpt-4o' works perfectly over OpenRouter, and we get a response model as expected:

```python
await client.chat.completions.create(
    **params,
    model="openai/gpt-4o",
    extra_headers=headers,
    response_model=response_model,
)
```

If you try AsyncAnthropic instead of AsyncOpenAI, the failure is stranger still: there is a long stack trace, and the outermost error suggests the request never reaches the proper endpoint.

```
    raise self._make_status_error_from_response(err.response) from None
E   anthropic.NotFoundError
```

Expected behavior: allow Anthropic models to be used over OpenRouter, whether via the OpenAI client or the Anthropic client (the latter would be better, since the two APIs are not identical).

ApurvaMisra commented 1 month ago

@syberkitten does Instructor work with OpenRouter for other models? I tried mixtral-8x22b-instruct and get this error:

```
AssertionError: Instructor does not support multiple tool calls, use List[Model] instead.
```

ApurvaMisra commented 1 month ago

I think I found a solution here https://python.useinstructor.com/examples/ollama/#ollama

scruffynerf commented 1 month ago

same bug as https://github.com/jxnl/instructor/issues/735

wasauce commented 1 month ago

@ApurvaMisra did your solution work?

> I think I found a solution here https://python.useinstructor.com/examples/ollama/#ollama

ApurvaMisra commented 1 month ago

Yup, this worked for me:

```python
client = instructor.from_openai(
    OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    ),
    mode=instructor.Mode.JSON,
)
```

vikyw89 commented 1 week ago

I found that MD_JSON mode works widely across many models.