BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Swagger UI doesn't load in offline environment #5737

Open cardin opened 2 months ago

cardin commented 2 months ago

What happened?

When hosting LiteLLM in an offline environment, the Swagger UI doesn't load because it tries to fetch swagger-ui.css and other Swagger assets from https://cdn.jsdelivr.net.

Because those assets cannot be fetched, a blank page is displayed.

LiteLLM should serve Swagger UI from a locally bundled copy of these assets instead of the CDN.
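For reference, FastAPI (which the LiteLLM proxy is built on) documents a recipe for self-hosting the docs assets: disable the default CDN-backed `/docs` route and register a custom one that points `get_swagger_ui_html` at locally served files. A minimal sketch follows; the `static/` directory and asset filenames are assumptions for illustration, not LiteLLM's actual layout:

```python
# Minimal sketch of self-hosted Swagger UI in a FastAPI app, following the
# standard FastAPI recipe. Paths and filenames are assumptions.
from fastapi import FastAPI
from fastapi.openapi.docs import get_swagger_ui_html
from fastapi.staticfiles import StaticFiles

app = FastAPI(docs_url=None)  # disable the default CDN-backed /docs route

# Serve locally bundled copies of swagger-ui-bundle.js and swagger-ui.css
app.mount("/static", StaticFiles(directory="static"), name="static")

@app.get("/docs", include_in_schema=False)
async def custom_swagger_ui_html():
    return get_swagger_ui_html(
        openapi_url=app.openapi_url,
        title=app.title + " - Swagger UI",
        swagger_js_url="/static/swagger-ui-bundle.js",
        swagger_css_url="/static/swagger-ui.css",
    )
```

With this in place, `/docs` renders entirely from local files and no request ever leaves the host.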

Relevant log output

No response


nicolasesprit commented 1 month ago

Same here.

A global audit is needed to remove every internet dependency for offline deployments.
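One hypothetical way to eliminate the CDN dependency, sketched below and not an existing LiteLLM feature: vendor the Swagger assets into the image at build time, so the running proxy never reaches out to the network. The pinned version and file list are assumptions:

```python
# Hypothetical build-time helper that downloads the Swagger UI assets into a
# local static/ directory, to be run where internet access is available
# (e.g. during a Docker build), so the deployed proxy stays fully offline.
import pathlib
import urllib.request

SWAGGER_CDN = "https://cdn.jsdelivr.net/npm/swagger-ui-dist@5"  # assumed pin
ASSETS = ["swagger-ui-bundle.js", "swagger-ui.css"]

def vendor_swagger_assets(dest: str = "static") -> None:
    out = pathlib.Path(dest)
    out.mkdir(parents=True, exist_ok=True)
    for name in ASSETS:
        urllib.request.urlretrieve(f"{SWAGGER_CDN}/{name}", str(out / name))

if __name__ == "__main__":
    vendor_swagger_assets()
```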