BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Setting `PROXY_BASE_URL` does not work. #6815

Closed: dipakparmar closed this issue 17 hours ago

dipakparmar commented 2 days ago

What happened?

Setting PROXY_BASE_URL does not work.

Looking at https://github.com/BerriAI/litellm/blob/1890fde3f377b0896e7ca5908953ef948ec1c8a7/ui/litellm-dashboard/src/app/page.tsx#L73-L76, the component is not reading any env var to populate the state.

Relevant log output

No response

Twitter / LinkedIn details

No response

krrishdholakia commented 17 hours ago

@dipakparmar it's loaded from the env here - https://github.com/BerriAI/litellm/blob/ddfe687b13e9f31db2fb2322887804e3d01dd467/litellm/proxy/management_endpoints/ui_sso.py#L645

and sent to the client via the /sso/get/ui_settings endpoint
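To illustrate the flow described above: a minimal sketch of the server-side pattern, where the proxy reads `PROXY_BASE_URL` from the environment and hands it to the dashboard through a settings payload. The function name and dict shape here are hypothetical, not the actual litellm implementation; the real logic lives in `ui_sso.py` behind the `/sso/get/ui_settings` endpoint.

```python
import os


def get_ui_settings() -> dict:
    # Hypothetical sketch: read PROXY_BASE_URL from the environment
    # (as ui_sso.py does) and return it in the settings payload the
    # dashboard fetches from /sso/get/ui_settings.
    return {
        "PROXY_BASE_URL": os.getenv("PROXY_BASE_URL"),
    }
```

The point is that the dashboard never reads the env var itself (client-side code can't); it fetches the value the server resolved at request time.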

Please share a minimal repro of the specific issue you're facing.