BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Bedrock stability ultra getting error message #6826

Open rasodu opened 4 hours ago

rasodu commented 4 hours ago

What happened?

[ERROR: litellm.BadRequestError: BedrockException - {"message":"{\"detail\":\"Invalid field in request. Available fields: ['prompt', 'negative_prompt', 'mode', 'strength', 'seed', 'output_format', 'image', 'aspect_ratio']\"}"} Received Model Group=sd_ultra_v1 Available Model Group Fallbacks=None]
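For context, a minimal sketch of how a call to this model group might look from the Python SDK side. This is an assumption, not the reporter's exact setup: the proxy's Model Group `sd_ultra_v1` presumably maps to a Bedrock Stable Image Ultra model ID (e.g. `stability.stable-image-ultra-v1:0`), and the request goes through litellm's image generation API.

```python
# Hypothetical repro sketch (assumed model ID and call path, not confirmed by the reporter).
import litellm

response = litellm.image_generation(
    model="bedrock/stability.stable-image-ultra-v1:0",  # assumed underlying Bedrock model ID
    prompt="a photo of an otter",
)
print(response.data[0].b64_json)  # Bedrock Stability models return base64-encoded images
```

The "Available fields" list in the error matches the Stable Image Ultra request schema, so the "Invalid field in request" message suggests litellm is sending at least one field that this newer Stability API does not accept.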

Relevant log output

No response

Twitter / LinkedIn details

No response

rasodu commented 4 hours ago

#6722: I requested support for this new model in that issue. I am now getting a different error message. The way I use the model is explained in the previous issue.