BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: liteLLM proxy /moderations endpoint returns 500 error when model is not specified #4336

Closed: malagna-amplify closed this issue 1 week ago

malagna-amplify commented 1 week ago

What happened?

If you hit the /moderations endpoint without specifying a model, you get the following error: {"error":{"message":"'model'","type":"None","param":"None","code":500}}

Expected behavior: the model should be optional. When hitting OpenAI directly without a model, it defaults to text-moderation-latest.

EDIT: It also appears that if I do specify the model ("model": "text-moderation-latest"), I instead get a 400 error: {"error":{"message":"400: {'error': 'moderations: Invalid model name passed in model=text-moderation-latest'}","type":"None","param":"None","code":400}}

Relevant log output

No response

Twitter / LinkedIn details

No response

ishaan-jaff commented 1 week ago

working on this

ishaan-jaff commented 1 week ago

Fix PR here: https://github.com/BerriAI/litellm/pull/4342

ishaan-jaff commented 1 week ago

Also added testing to ensure we don't push a regression on this.

ishaan-jaff commented 1 week ago

@malagna-amplify any chance we can hop on a call? I'd love to learn how we can improve litellm for you.

I reached out to you on LinkedIn if DMs work. Sharing a link to my calendar for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

malagna-amplify commented 2 days ago

@ishaan-jaff Thanks for reaching out! I put time on your calendar to discuss.