BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Configuring the DALL-E model through the UI can lead to model errors. #3981

Open yingapple opened 6 months ago

yingapple commented 6 months ago

What happened?

I configured the mapping rules for DALL-E in the UI, and the default model name it generates is something like standard/xxx/xx, which is an invalid name.

Relevant log output

```
BadRequestError: Error code: 400 - {'error': {'message': "BadRequestError: OpenAIException - Error code: 400 - {'error': {'code': None, 'message': 'Invalid model standard/1024-x-1024/dall-e-3. The model argument should be left blank.', 'param': None, 'type': 'invalid_request_error'}}\nNumber Retries = 1", 'type': None, 'param': None, 'code': 400}}
```

Twitter / LinkedIn details

No response

thfrei commented 2 months ago

For anyone running into this issue: you can add your own custom model name and simply enter dall-e-3.

Probably obvious, but I didn't find it right away (see attached screenshot).
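For those configuring via file rather than the UI, the same workaround can be sketched as a litellm proxy `config.yaml` entry (the alias name and the environment-variable API key reference here are illustrative assumptions, not taken from the issue). The key point is that the underlying model id must be the plain `dall-e-3`, with no `standard/1024-x-1024/` prefix, since OpenAI rejects that as an invalid model:

```yaml
# Hypothetical litellm proxy config.yaml entry for DALL-E 3.
model_list:
  - model_name: dall-e-3                  # alias your clients will request
    litellm_params:
      model: openai/dall-e-3              # actual provider/model id, unprefixed
      api_key: os.environ/OPENAI_API_KEY  # read key from the environment
```

Clients then request the image model by the `model_name` alias, and the proxy forwards the clean `dall-e-3` id to OpenAI.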