Closed: markokraemer closed this issue 1 month ago.
Hi @markokraemer, great seeing you on this corner of the internet. It's been a little while, thanks for the bug submission!
Do you have code to reproduce this so I can add some tests?
Turning on litellm's modify_params seems like the right thing to do and is probably pretty safe, but I want to make sure it solves your issue.
Thanks for working on this! :)
This is part of my code:
- agent_base.py: https://pastebin.com/raw/GD2EcW7X
- run_agent.py: https://pastebin.com/raw/weCabkd6
Yeah, the issue is that Anthropic handles the message list differently than OpenAI, so it's not a one-line replacement without modify_params. It would be convenient to have it automatically add the placeholder message.
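For context, here's a minimal sketch of what that placeholder insertion amounts to, as I understand it from the error message (pure Python; the helper name is my own, not part of litellm's API):

```python
def ensure_first_user_message(messages, placeholder="."):
    """Anthropic requires the first non-system message to have role='user'.

    This mimics what litellm's modify_params reportedly does: if the first
    non-system message is not from the user, insert a placeholder user
    message (".") ahead of it. System messages are left alone, since
    Anthropic sends the system prompt separately.
    """
    fixed = list(messages)
    for i, msg in enumerate(fixed):
        if msg["role"] != "system":
            if msg["role"] != "user":
                # First conversational turn isn't from the user: pad it.
                fixed.insert(i, {"role": "user", "content": placeholder})
            break
    return fixed


# Example: a history that starts with an assistant turn after the system prompt,
# which Anthropic would reject but OpenAI accepts.
history = [
    {"role": "system", "content": "You are helpful."},
    {"role": "assistant", "content": "How can I help?"},
]
print(ensure_first_user_message(history))
```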
For some reason I'm not able to reproduce the issue using your code. I had to take a bunch of stuff out to get it to run without the missing pieces so maybe I'm missing some code path?
Do you have a trace that shows where in the code it blows up?
Here's what I'm running:
https://github.com/datastax/astra-assistants-api/tree/issue-50-repro/repro
Any chance you have a minimal reproducible example?
I also test with Anthropic here, and it doesn't fail in CI or locally.
Thanks for your patience!
Hey @markokraemer is this still blocking you? I can just add the setting in a branch if you're willing to test it.
Hi @markokraemer I made a branch and docker image for you with the setting:
https://github.com/datastax/astra-assistants-api/tree/ISSUE-50
Let me know if your issue goes away.
openai.InternalServerError: Error code: 500 - {'message': "Error: litellm.BadRequestError: AnthropicException - litellm.BadRequestError: AnthropicException - Invalid first message. Should always start with 'role'='user' for Anthropic. System prompt is sent separately for Anthropic. set 'litellm.modify_params = True' or 'litellm_settings:modify_params = True' on proxy, to insert a placeholder user message - '.' as the first message, "}
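Per the error text itself, the setting can also be applied in the litellm proxy config; a sketch of that form (keys taken verbatim from the error message):

```yaml
litellm_settings:
  modify_params: true
```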