BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Support temperature parameter for Bedrock Claude3 #2348

Open zthang opened 4 months ago

zthang commented 4 months ago

The Feature

Support temperature parameter for Bedrock Claude3 model.

Motivation, pitch

When passing the temperature parameter to the litellm API with a Bedrock Claude 3 model, it raises litellm.utils.UnsupportedParamsError: bedrock does not support parameters: {'temperature': 0}. To drop these, set litellm.drop_params=True. However, I have verified that Bedrock does support the temperature parameter for Claude 3.
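The error message above describes litellm's param-validation behavior: unsupported params either raise or, with drop_params, are silently removed. A rough standalone sketch of that logic (not litellm's actual implementation; the function name and signature are illustrative):

```python
def filter_params(params: dict, supported: list, drop_params: bool = False) -> dict:
    """Drop or reject OpenAI-style params the provider does not accept.

    Illustrative sketch of the behavior described in the error message,
    not litellm's real code.
    """
    unsupported = {k: v for k, v in params.items() if k not in supported}
    if unsupported and not drop_params:
        # Mirrors the UnsupportedParamsError message quoted above.
        raise ValueError(
            f"bedrock does not support parameters: {unsupported}. "
            "To drop these, set litellm.drop_params=True"
        )
    # With drop_params=True, unsupported keys are silently discarded.
    return {k: v for k, v in params.items() if k in supported}

print(filter_params({"temperature": 0, "max_tokens": 10},
                    supported=["max_tokens"], drop_params=True))
# → {'max_tokens': 10}
```

So drop_params=True is a workaround, but the real fix is for temperature to be listed as supported for Claude 3 on Bedrock.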

Twitter / LinkedIn details

No response

KyleZhang0536 commented 4 months ago

I think adding temperature to the get_supported_openai_params function in bedrock.py should fix this.
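The idea would be for the provider's supported-params list to include temperature for Claude 3 model ids. A minimal sketch of that shape (illustrative only; litellm's actual function dispatches on provider and model differently):

```python
def get_supported_openai_params(model: str) -> list:
    """Return the OpenAI-style params accepted for a Bedrock model.

    Hypothetical sketch for illustration; the param lists here are
    assumptions, not copied from litellm.
    """
    if "anthropic.claude-3" in model:
        # Claude 3 on Bedrock (Messages API) accepts temperature/top_p.
        return ["max_tokens", "temperature", "top_p", "stream", "stop"]
    # Fallback for models where temperature support is not assumed.
    return ["max_tokens", "stream", "stop"]

print(get_supported_openai_params("anthropic.claude-3-sonnet-20240229-v1:0"))
# → ['max_tokens', 'temperature', 'top_p', 'stream', 'stop']
```

With temperature in the returned list, the param-validation step would pass it through instead of raising UnsupportedParamsError.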

Manouchehri commented 3 months ago

temperature seems to be supported.

https://github.com/BerriAI/litellm/blob/d89644d46cf3bd04167eda9f7d1b79be125fc892/litellm/llms/bedrock.py#L80-L100

Could you give an example PoC to reproduce the issue, if it still exists? 😄

KyleZhang0536 commented 3 months ago

> temperature seems to be supported.
>
> https://github.com/BerriAI/litellm/blob/d89644d46cf3bd04167eda9f7d1b79be125fc892/litellm/llms/bedrock.py#L80-L100
>
> Could you give an example PoC to reproduce the issue, if it still exists? 😄

Thanks, it works now.