Open nobu007 opened 2 months ago
Hey @nobu007, thanks for testing this with Anthropic and making this fix!
Should we just set litellm.drop_params = True
? It seems like that could never be bad— if LiteLLM doesn't want to pass a param into the LLM, I feel like we should just let it silently drop it. Any side effects of that that you can see?
@KillianLucas I also added "llm_modify_params". Please check it!
@KillianLucas I rebased.
Setting litellm.drop_params=True avoids this error (https://github.com/OpenInterpreter/open-interpreter/issues/1240): "Error in chat: AnthropicException - anthropic does not support parameters". I think drop_params is a good alternative for debugging, and a workaround for future errors of this kind.
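To illustrate the behavior being discussed, here is a minimal sketch (not litellm's actual implementation) of what drop_params does conceptually: parameters the target provider does not support are silently dropped instead of raising an error. The supported-parameter table and function names below are hypothetical, for illustration only.

```python
# Hypothetical per-provider supported-parameter table (illustration only).
SUPPORTED_PARAMS = {
    "anthropic": {"model", "messages", "max_tokens", "temperature", "stream"},
}

def filter_params(provider: str, drop_params: bool, **kwargs):
    """Sketch of drop_params: keep only params the provider supports."""
    supported = SUPPORTED_PARAMS[provider]
    unsupported = set(kwargs) - supported
    if unsupported and not drop_params:
        # Mirrors "AnthropicException - anthropic does not support parameters"
        raise ValueError(
            f"{provider} does not support parameters: {sorted(unsupported)}"
        )
    # With drop_params enabled, unsupported keys are silently discarded.
    return {k: v for k, v in kwargs.items() if k in supported}

# "functions" is not in the Anthropic set, so it is dropped rather than raising.
params = filter_params("anthropic", drop_params=True,
                       model="claude-3", messages=[], functions=[{"name": "run"}])
```

With drop_params=False the same call would raise instead, which is the error reported in #1240.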
Describe the changes you have made:
This option allows setting litellm.drop_params=True.
Reference any relevant issues (e.g. "Fixes #000"):
Related to https://github.com/OpenInterpreter/open-interpreter/issues/1240.
Pre-Submission Checklist (optional but appreciated):
docs/CONTRIBUTING.md
docs/ROADMAP.md
I found "Use Anthropic function calling" in ROADMAP.md. Is #1240 simply not implemented yet?
OS Tests (optional but appreciated):
Test (works fine)
Note1
This error is probably resolved by this PR.
Note2
"litellm.modify_params" activates code in litellm that appends a dummy user message when the first message is empty or not from the user, which Anthropic models require. It is not critical, but it helps in my case.
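A minimal sketch of the behavior described above (illustrative only; litellm's internal code differs): Anthropic expects the conversation to start with a non-empty user message, so a placeholder user message is prepended when that is not the case. The function name and placeholder content are assumptions for this sketch.

```python
def ensure_first_user_message(messages):
    """Prepend a dummy user message if the first message is empty or non-user.

    Illustrative sketch of what litellm.modify_params enables for Anthropic;
    not litellm's actual implementation.
    """
    first = messages[0] if messages else {}
    if first.get("role") != "user" or not first.get("content"):
        # Dummy message added at the front so the request is accepted.
        return [{"role": "user", "content": "."}] + messages
    return messages

fixed = ensure_first_user_message([{"role": "assistant", "content": "hi"}])
```

Here `fixed` starts with the placeholder user message, while a conversation that already begins with a non-empty user message passes through unchanged.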