BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: can't add model / Cannot access 'lS' before initialization #6892

Open vid opened 1 day ago

vid commented 1 day ago

What happened?

I'm trying to add an Ollama model via the UI. I choose "Add Model", provider "ollama", public model name "llama3.2", LiteLLM model name "llama3.2", and provide the base URL (same as the env value below), then hit "Add Model". Nothing happens, but I see the attached errors in the console (the form resets at that point). Under "All Models" no models are listed. When I reload, I get a page-level error: "Cannot access 'lS' before initialization." I have to clear the browser's application storage to continue.

using OLLAMA_BASE_URLS: http://ollama:11434
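As a point of comparison (a sketch, assuming the standard LiteLLM proxy `config.yaml` format, not something from this report), the same model can normally be registered via the proxy config instead of the UI:

```yaml
# Sketch: registering the Ollama model in the proxy's config.yaml
# rather than through the Add Model UI. Uses LiteLLM's model_list format;
# the api_base matches the OLLAMA_BASE_URLS value above.
model_list:
  - model_name: llama3.2              # public model name shown to clients
    litellm_params:
      model: ollama/llama3.2          # provider/model for LiteLLM routing
      api_base: http://ollama:11434   # Ollama server base URL
```

Models added this way are loaded at proxy startup, so they sidestep the UI flow where the bug occurs.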

(screenshot: browser console errors)

By the way, the form fields "LiteLLM Model Name(s)" and "Base Url" are autofilled with my login credentials by default, which shouldn't happen.

Relevant log output

No response

Twitter / LinkedIn details

No response

vid commented 1 day ago

Just noticed: if I enable STORE_MODEL_IN_DB, it says it added the model, but no models show up, and I see the same error after reloading. If I clear the cache and log in again, the model does show up.