BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: not being able to find the routing strategy arg in the UI screen. #5889

Open FrostScent opened 1 month ago

FrostScent commented 1 month ago

What happened?

I am using the OSS version of LiteLLM. I have set `routing_strategy_args` in my config.yaml, but I cannot see it in the LiteLLM UI, so I am not sure whether the arguments are actually being applied. Below I have attached my config.yaml and the UI screen.

[Screenshot: config.yaml]

[Screenshot: LiteLLM UI Router Settings page]

Relevant log output

No response

Twitter / LinkedIn details

No response

krrishdholakia commented 1 month ago

@FrostScent I see it in your screenshot - look at routing_strategy (the first item in the list)

[Screenshot: Router Settings page showing the routing_strategy field]
FrostScent commented 1 month ago

Thanks for the response @krrishdholakia :)

I think I might have phrased that confusingly. I was referring to `routing_strategy_args`, not `routing_strategy`. My config.yaml contains the entry `routing_strategy_args: {"ttl": 10}`, but at the bottom of the screenshot the "Routing Strategy Specific Args" section says "No specific settings." (I think `lowest_latency_buffer` has the same problem.)

Could you please check it again? Sorry for the inconvenience.
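For context, a config.yaml fragment matching what is described above would look roughly like the following. This is a minimal sketch, not the reporter's full config: the `model_list` entry and the `latency-based-routing` strategy choice are assumptions (the `ttl` argument is documented for that strategy); only the `routing_strategy_args: {"ttl": 10}` line is quoted from the issue.

```yaml
model_list:
  # hypothetical deployment entry, for illustration only
  - model_name: gpt-4o
    litellm_params:
      model: azure/gpt-4o
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY

router_settings:
  routing_strategy: latency-based-routing   # assumed; ttl applies to latency-based routing
  routing_strategy_args: {"ttl": 10}        # the setting the UI reports as "No specific settings"
```

The bug report is that settings under `routing_strategy_args` (and reportedly `lowest_latency_buffer`) do not surface in the UI's "Routing Strategy Specific Args" section, even though `routing_strategy` itself is displayed.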