Seluj78 opened 11 hours ago
If I am not mistaken, using the Python SDK I can see the JSON response from `claude-3-5-sonnet-20241022` (with `response_format={"type": "json_object"}`) like so:

`response.choices[0].message.tool_calls[0].function.arguments`

which works, but these extra steps are only needed when I use Claude; with GPT I can just do `response.choices[0].message.content`.
Can I see the request you're making, @Seluj78?
Able to repro with this request:

```python
litellm.completion(
    model='claude-3-5-sonnet-20241022',
    messages=[
        {'role': 'system', 'content': 'Your output should be a JSON object with no additional properties. '},
        {'role': 'user', 'content': 'Respond with this in json. city=San Francisco, state=CA, weather=sunny, temp=60'},
    ],
    response_format={'type': 'json_object'},
)
```
The exact problem was that `response.choices[0].message.content` is `None`, because Anthropic JSON output isn't supported inside your package.
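Until JSON mode is mapped through for Anthropic models, the two response shapes described above can be bridged with a small helper. This is only a sketch: the function name `extract_json` and the stand-in response objects below are mine, not part of litellm's API.

```python
import json
from types import SimpleNamespace


def extract_json(response):
    """Parse the model's JSON output wherever it was surfaced.

    GPT-style responses put the JSON string in message.content;
    the Claude responses described above leave content as None and
    put the JSON in the first tool call's arguments instead.
    """
    message = response.choices[0].message
    if message.content is not None:
        return json.loads(message.content)
    return json.loads(message.tool_calls[0].function.arguments)


# Faked response shapes for illustration -- not real API objects.
gpt_like = SimpleNamespace(choices=[SimpleNamespace(message=SimpleNamespace(
    content='{"city": "San Francisco", "temp": 60}', tool_calls=None))])
claude_like = SimpleNamespace(choices=[SimpleNamespace(message=SimpleNamespace(
    content=None,
    tool_calls=[SimpleNamespace(function=SimpleNamespace(
        arguments='{"city": "San Francisco", "temp": 60}'))]))])

print(extract_json(gpt_like)["temp"])     # -> 60
print(extract_json(claude_like)["city"])  # -> San Francisco
```

With something like this, calling code can stay identical across providers until the library returns the JSON in `message.content` for Claude as well.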
The Feature
Currently, you only support JSON mode for a small set of models: https://docs.litellm.ai/docs/completion/json_mode

Motivation, pitch
I need to be able to do the same with Anthropic models, without having to tell the model how to format its JSON output.
Twitter / LinkedIn details
No response