BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: json_schema support for Anthropic #6741

Open Seluj78 opened 11 hours ago

Seluj78 commented 11 hours ago

The Feature

Currently, you only support JSON mode for a small set of models: https://docs.litellm.ai/docs/completion/json_mode
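
For reference, a minimal sketch of JSON mode as those docs describe it, assuming an OpenAI model that litellm lists as supported (the model name and prompt here are illustrative):

import litellm

# JSON mode as documented: response_format={"type": "json_object"} asks the
# provider to return a JSON string directly in message.content.
response = litellm.completion(
    model="gpt-4o-mini",  # illustrative; any model from the json_mode docs
    messages=[{"role": "user", "content": "List three US cities as a JSON array."}],
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)  # a JSON string for supported models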

Motivation, pitch

I need to be able to do the same with Anthropic models, without having to tell the model in the prompt how to format its JSON output.

Twitter / LinkedIn details

No response

Seluj78 commented 11 hours ago

If I am not mistaken, using the Python SDK with response_format={"type": "json_object"}, I can get the JSON response from claude-3-5-sonnet-20241022 like so:

response.choices[0].message.tool_calls[0].function.arguments

which I can use, but it's an extra step that only applies to Claude; with GPT I can simply read response.choices[0].message.content.
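
A hedged sketch of that workaround (the model name comes from this thread; the prompt and the json.loads step are illustrative):

import json

import litellm

response = litellm.completion(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Respond in JSON: name=Ada, role=engineer"}],
    response_format={"type": "json_object"},
)

# Extra step needed for Claude: the JSON arrives as tool-call arguments,
# not in message.content, so it has to be dug out and parsed by hand.
raw = response.choices[0].message.tool_calls[0].function.arguments
data = json.loads(raw)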

ishaan-jaff commented 3 hours ago

Can I see the request you're making, @Seluj78?

> response.choices[0].message.tool_calls[0].function.arguments — which I can use, but it's an extra step that only applies to Claude; with GPT I can simply read response.choices[0].message.content.

ishaan-jaff commented 3 hours ago

Able to repro with this request:

import litellm

response = litellm.completion(
    model="claude-3-5-sonnet-20241022",
    messages=[
        {
            "role": "system",
            "content": "Your output should be a JSON object with no additional properties.",
        },
        {
            "role": "user",
            "content": "Respond with this in json. city=San Francisco, state=CA, weather=sunny, temp=60",
        },
    ],
    response_format={"type": "json_object"},
)
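
With the repro assigned to response, the asymmetry described in this thread would show up like this (output shapes assumed from the comments above, not verified):

print(response.choices[0].message.content)  # None for Claude in this repro
print(response.choices[0].message.tool_calls[0].function.arguments)
# assumed shape: '{"city": "San Francisco", "state": "CA", "weather": "sunny", "temp": 60}'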
Seluj78 commented 21 minutes ago

The exact problem is that response.choices[0].message.content is None, because your package doesn't support JSON output for Anthropic there; the JSON only shows up in tool_calls.
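
Until this is supported natively, a hypothetical caller-side shim (not part of litellm) could paper over the difference:

def json_content(response) -> str:
    """Return the JSON payload regardless of provider quirks.

    Hypothetical helper: reads message.content when present (GPT-style JSON
    mode), and falls back to the tool-call arguments where Claude currently
    puts the JSON, per this thread.
    """
    message = response.choices[0].message
    if message.content is not None:
        return message.content
    return message.tool_calls[0].function.arguments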