BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

LiteLLM Proxy Startup Error: TypeError in check_view_exists() #5702

Closed: kishan-getstarted closed this issue 1 month ago

kishan-getstarted commented 1 month ago

When attempting to start the LiteLLM proxy server following the quick start guide, I encountered an error during the application startup process. The error occurs in the check_view_exists() function and seems to be related to handling a None value.

Steps to Reproduce

1. Follow the quick start guide at https://docs.litellm.ai/docs/proxy/quick_start (steps sketched below).
2. Attempt to start the LiteLLM proxy server.
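
For context, a minimal sketch of the quick-start steps being followed; the model name is illustrative and the exact commands may differ from the current docs:

```bash
# Install the proxy extras and start the server (model name is illustrative):
pip install 'litellm[proxy]'
litellm --model gpt-3.5-turbo
# The proxy listens on http://0.0.0.0:4000 by default; the error below
# occurs during the Starlette/FastAPI application-startup (lifespan) phase.
```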

Error Message

```
ERROR:    Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 732, in lifespan
    async with self.lifespan_context(app) as maybe_state:
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 608, in __aenter__
    await self._router.startup()
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 709, in startup
    await handler()
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/proxy/proxy_server.py", line 2920, in startup_event
    create_view_response = await prisma_client.check_view_exists()
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/backoff/_async.py", line 151, in retry
    ret = await target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/proxy/utils.py", line 995, in check_view_exists
    if required_view not in ret[0]["view_names"]:
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable

ERROR:    Application startup failed. Exiting.
```
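
The failing line indexes `ret[0]["view_names"]`, which can evidently be `None` on a fresh database. A minimal sketch of the failure mode and a defensive guard; this is illustrative only, not LiteLLM's actual fix, and the query row shape and view name are assumptions:

```python
# Illustrative sketch, not LiteLLM's actual code. Assumes the raw query
# behind check_view_exists() returns rows shaped like
# [{"view_names": <list of names, or None>}].

def view_exists(ret: list, required_view: str) -> bool:
    # On a fresh database the row's "view_names" value can be None, so
    # `required_view not in ret[0]["view_names"]` raises:
    #   TypeError: argument of type 'NoneType' is not iterable
    view_names = ret[0].get("view_names") if ret else None
    if view_names is None:
        # Treat "no view info returned" as "view missing" instead of
        # crashing application startup.
        return False
    return required_view in view_names

# Safely reproduces the crash scenario reported above:
print(view_exists([{"view_names": None}], "SomeRequiredView"))  # False
```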
ksquarekumar commented 1 month ago

I am also facing the same issue on a Kubernetes deployment. Running Prisma commands like `push` and `validate` from the pod shell works with `DATABASE_URL` set, but starting the proxy with `litellm --port 4000 --config /app/proxy_server_config.yaml` leads to this error and the workers fail to start.

I am running `ghcr.io/berriai/litellm:main-latest`.
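
To make the reproduction concrete, here is the sequence described above sketched as shell commands; the connection string is a placeholder and the exact Prisma subcommands are assumptions:

```bash
# From the pod shell, with the database URL exported (placeholder credentials):
export DATABASE_URL="postgresql://user:password@postgres:5432/litellm"

prisma db push    # succeeds: schema is applied to the database
prisma validate   # succeeds: schema is valid

# Starting the proxy with the same environment fails during startup:
litellm --port 4000 --config /app/proxy_server_config.yaml
# -> TypeError: argument of type 'NoneType' is not iterable (check_view_exists)
```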

ishaan-jaff commented 1 month ago

@kishan-getstarted @ksquarekumar the fix will be in 1.46.0.
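
Once that version is published, upgrading would look something like this (the install method and pin syntax are assumptions about your setup):

```bash
# pip-based installs: upgrade to the release carrying the fix
pip install --upgrade 'litellm[proxy]==1.46.0'

# Docker-based installs: re-pull the image after the release is published
docker pull ghcr.io/berriai/litellm:main-latest
```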

agileben commented 1 month ago

@ishaan-jaff I tried a fresh deploy of 1.46.0 on Railway, but I think I am seeing the same error:

```
INFO:     Started server process [1]
INFO:     Waiting for application startup.
ERROR:    Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 732, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 608, in __aenter__
    await self._router.startup()
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 709, in startup
    await handler()
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 2905, in startup_event
    create_view_response = await prisma_client.check_view_exists()
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/backoff/_async.py", line 151, in retry
    ret = await target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/utils.py", line 995, in check_view_exists
    if required_view not in ret[0]["view_names"]:
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable

ERROR:    Application startup failed. Exiting.
```

kishan-getstarted commented 1 month ago

I have taken a pull from main and updated litellm, but I am still facing the same issue. I am just running `docker-compose up`.

Am I doing something wrong here?

krrishdholakia commented 1 month ago

Hi everyone, thank you for trying this so far. We believe we have a fix on main for this now.

It's not published yet due to some CI/CD issues that we're working through. We're hoping to have this fixed by EOD.

krrishdholakia commented 1 month ago

@kishan-getstarted I can confirm this works for me on main-latest with a new DB. This is the new warning you should now be seeing: https://github.com/BerriAI/litellm/blob/8d4339c702ebcfa48c7a0d82a9c213a643e3016f/litellm/proxy/utils.py#L1029

[Screenshot (2024-09-17): startup logs showing the new warning]

kishan-getstarted commented 1 month ago

Yeah, okay. So far I had to edit the docker-compose file to change the DB; I can see it should take the value from the .env, but it was not. After a few retries I managed to resolve the previous error, but now a new one is coming up, as below:

[Screenshots (2024-09-18): startup logs showing the new error]

I can confirm my PG is running.

Maybe it's an issue on my system, idk tbh.

krrishdholakia commented 1 month ago

This looks like an unrelated issue @kishan-getstarted.

It looks like your DB is either not running or the URL is incorrect.
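
For anyone debugging this, a hypothetical minimal docker-compose wiring for the proxy plus Postgres; service names, image tags, and credentials are illustrative, not the stock compose file:

```yaml
# Hypothetical minimal wiring; names and credentials are placeholders.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    ports:
      - "4000:4000"
    env_file:
      - .env   # expects DATABASE_URL=postgresql://llmproxy:secret@db:5432/litellm
    depends_on:
      - db

  db:
    image: postgres:16
    environment:
      POSTGRES_USER: llmproxy
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: litellm
```

One common pitfall: inside the litellm container, `DATABASE_URL` must point at the compose service name (`db` above) rather than `localhost`, or the proxy will report the database as unreachable even when Postgres is running on the host.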

kishan-getstarted commented 1 month ago

@krrishdholakia it works fine now. I had to run `docker compose down && docker compose up --build -d`, and then `docker compose up`.

Now I am able to run the app locally. I am happy to close this thread :)

Thanks for your response and help!