BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: prisma.errors.DataError: The column `requester_ip_address` does not exist in the current database. #4876

Open devdev999 opened 1 month ago

devdev999 commented 1 month ago

What happened?

I am getting the error `prisma.errors.DataError: The column 'requester_ip_address' does not exist in the current database.` The proxy still works, but the latest spend is not being tracked. The LiteLLM container does not have access to the internet; from what I can gather from the logs, it is trying to install a dependency from online, which then fails.

There are some Prisma errors on LiteLLM container startup:


Installing Prisma CLI

Traceback (most recent call last):

  File "/usr/local/lib/python3.11/urllib/request.py", line 1348, in do_open

    h.request(req.get_method(), req.selector, req.data, headers,

  File "/usr/local/lib/python3.11/http/client.py", line 1298, in request

    self._send_request(method, url, body, headers, encode_chunked)

  File "/usr/local/lib/python3.11/http/client.py", line 1344, in _send_request

    self.endheaders(body, encode_chunked=encode_chunked)

  File "/usr/local/lib/python3.11/http/client.py", line 1293, in endheaders

    self._send_output(message_body, encode_chunked=encode_chunked)

  File "/usr/local/lib/python3.11/http/client.py", line 1052, in _send_output

    self.send(msg)

  File "/usr/local/lib/python3.11/http/client.py", line 990, in send

    self.connect()

  File "/usr/local/lib/python3.11/http/client.py", line 1463, in connect

    super().connect()

  File "/usr/local/lib/python3.11/http/client.py", line 956, in connect

    self.sock = self._create_connection(

                ^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/socket.py", line 827, in create_connection

    for res in getaddrinfo(host, port, 0, SOCK_STREAM):

               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/socket.py", line 962, in getaddrinfo

    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):

               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "<frozen runpy>", line 198, in _run_module_as_main

  File "<frozen runpy>", line 88, in _run_code

  File "/usr/local/lib/python3.11/site-packages/nodeenv.py", line 1548, in <module>

    main()

  File "/usr/local/lib/python3.11/site-packages/nodeenv.py", line 1119, in main

    args.node = get_last_stable_node_version()

                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/nodeenv.py", line 1052, in get_last_stable_node_version

    return _get_versions_json()[0]['version'].lstrip('v')

           ^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/nodeenv.py", line 1028, in _get_versions_json

    response = urlopen('%s/index.json' % src_base_url)

               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/nodeenv.py", line 652, in urlopen

    return urllib2.urlopen(req)

           ^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/urllib/request.py", line 216, in urlopen

    return opener.open(url, data, timeout)

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/urllib/request.py", line 519, in open

    response = self._open(req, data)

               ^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/urllib/request.py", line 536, in _open

    result = self._call_chain(self.handle_open, protocol, protocol +

             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/urllib/request.py", line 496, in _call_chain

    result = func(*args)

             ^^^^^^^^^^^

  File "/usr/local/lib/python3.11/urllib/request.py", line 1391, in https_open

    return self.do_open(http.client.HTTPSConnection, req,

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/urllib/request.py", line 1351, in do_open

    raise URLError(err)

urllib.error.URLError: <urlopen error [Errno -3] Temporary failure in name resolution>

nodeenv installation failed; You may want to try installing `nodejs-bin` as it is more reliable.

Traceback (most recent call last):

  File "/usr/local/bin/prisma", line 8, in <module>

    sys.exit(main())

             ^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/cli.py", line 39, in main

    sys.exit(prisma.run(args[1:]))

             ^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/prisma.py", line 36, in run

    entrypoint = ensure_cached().entrypoint

                 ^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/prisma.py", line 93, in ensure_cached

    proc = npm.run(

           ^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/_proxy.py", line 19, in __getattr__

    return getattr(self.__get_proxied__(), attr)

                   ^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/_proxy.py", line 35, in __get_proxied__

    self.__proxied = proxied = self.__load__()

                               ^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/_node.py", line 406, in __load__

    return resolve(self.target)

           ^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/_node.py", line 282, in resolve

    return NodeBinaryStrategy.resolve(target)

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/_node.py", line 161, in resolve

    return NodeBinaryStrategy.from_nodeenv(target)

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/_node.py", line 191, in from_nodeenv

    raise exc

  File "/usr/local/lib/python3.11/site-packages/prisma/cli/_node.py", line 174, in from_nodeenv

    subprocess.run(

  File "/usr/local/lib/python3.11/subprocess.py", line 571, in run

    raise CalledProcessError(retcode, process.args,

subprocess.CalledProcessError: Command '['/usr/local/bin/python', '-m', 'nodeenv', '/root/.cache/prisma-python/nodeenv']' returned non-zero exit status 1.

Relevant log output

Job "update_spend (trigger: interval[0:00:57], next run at: 2024-07-25 16:24:43 +08)" raised an exception

Traceback (most recent call last):

  File "/usr/local/lib/python3.11/site-packages/apscheduler/executors/base_py3.py", line 30, in run_coroutine_job

    retval = await job.func(*job.args, **job.kwargs)

             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/utils.py", line 2699, in update_spend

    raise e

  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/utils.py", line 2660, in update_spend

    await prisma_client.db.litellm_spendlogs.create_many(

  File "/usr/local/lib/python3.11/site-packages/prisma/actions.py", line 9489, in create_many

    resp = await self._client._execute(

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/client.py", line 525, in _execute

    return await self._engine.query(builder.build(), tx_id=self._tx_id)

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/engine/query.py", line 244, in query

    return await self.request(

           ^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/engine/http.py", line 141, in request

    return utils.handle_response_errors(resp, errors_data)

           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/usr/local/lib/python3.11/site-packages/prisma/engine/utils.py", line 200, in handle_response_errors

    raise prisma_errors.DataError(data[0])

prisma.errors.DataError: The column `requester_ip_address` does not exist in the current database.

Twitter / LinkedIn details

No response

krrishdholakia commented 1 month ago

hey @devdev999, are you able to tell what version of Prisma is running?

krrishdholakia commented 1 month ago

@devdev999 Can you try running the litellm-database dockerfile - https://github.com/BerriAI/litellm/pkgs/container/litellm-database

It pre-generates the Prisma client, which I think might solve this issue.

yeounhak commented 1 month ago

What is the difference between litellm and litellm-database?

Thank you.

krrishdholakia commented 1 month ago

Hey @devdev999, I was able to fix this by using LiteLLM's Dockerfile as a base image and running `prisma generate` as part of my Dockerfile:

# Use the provided base image
FROM ghcr.io/berriai/litellm:main-latest

# Set the working directory to /app
WORKDIR /app

### [👇 KEY STEP] ###
# Install Prisma CLI and generate Prisma client
RUN pip install prisma 
RUN prisma generate
### FIN #### 

# Expose the necessary port
EXPOSE 4000

# Override the CMD instruction with your desired command and arguments
# WARNING: FOR PROD DO NOT USE `--detailed_debug` it slows down response times, instead use the following CMD
# CMD ["--port", "4000", "--config", "config.yaml"]

# Define the command to run your app
ENTRYPOINT ["litellm"]

CMD ["--port", "4000"]

Docs: https://docs.litellm.ai/docs/proxy/deploy#litellm-without-internet-connection
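One way to sanity-check the resulting image is to run the Prisma CLI with networking disabled (a sketch; the tag `litellm-patched` is a placeholder, and the `grep` pattern assumes the `prisma --version` output format reported later in this thread):

```shell
# Build the patched image from the Dockerfile above.
docker build -t litellm-patched .

# Run `prisma --version` with no network attached; if the engines were
# baked in by `prisma generate`, this prints engine info instead of
# falling into the nodeenv download that fails offline.
docker run --rm --network none --entrypoint prisma litellm-patched --version \
  | grep "Query Engine" && echo "offline prisma OK"
```

If the engines are missing, the second command fails with the same nodeenv/name-resolution errors as in the original traceback.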

devdev999 commented 1 month ago

I still get the same error even after running the image built from this Dockerfile. How do I verify that the Prisma binaries are inside the image? Where are they stored?

krrishdholakia commented 1 month ago

I ran `prisma --version` inside the Docker terminal.

This should show you the binaries.

krrishdholakia commented 1 month ago

If you see it print out `installing node...`, then you know it's not installed.
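Scripted, that check might look like this (a sketch; `litellm-proxy` is a hypothetical container name, so substitute your own):

```shell
# If the engines are baked into the image, `prisma --version` prints a
# version table; if not, it first emits "Installing Prisma CLI" /
# "installing node" and tries to download Node from the internet.
out=$(docker exec litellm-proxy prisma --version 2>&1)
if printf '%s\n' "$out" | grep -qi "installing node"; then
  echo "prisma engines NOT baked in"
else
  echo "prisma engines present"
fi
```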

devdev999 commented 1 month ago

Verified that the binaries are in the built image, but I am still getting the same container startup error and the `requester_ip_address` not found error. Tried both `ghcr.io/berriai/litellm:main-latest` and `ghcr.io/berriai/litellm:main-v1.42.5-stable` as the base image:

prisma                : 5.4.2
@prisma/client        : Not found
Current platform      : debian-openssl-3.0.x
Query Engine (Binary) : query-engine ac9d7041ed77bcc8a8dbd2ab6616b39013829574 (at .prisma/.cache/prisma-python/binaries/5.4.2/ac9d7041ed77bcc8a8dbd2ab6616b39013829574/node_modules/@prisma/engines/query-engine-debian-openssl-3.0.x)
Schema Engine         : schema-engine-cli ac9d7041ed77bcc8a8dbd2ab6616b39013829574 (at .prisma/.cache/prisma-python/binaries/5.4.2/ac9d7041ed77bcc8a8dbd2ab6616b39013829574/node_modules/@prisma/engines/schema-engine-debian-openssl-3.0.x)
Schema Wasm           : @prisma/prisma-schema-wasm 5.4.1-2.ac9d7041ed77bcc8a8dbd2ab6616b39013829574
Default Engines Hash  : ac9d7041ed77bcc8a8dbd2ab6616b39013829574
Studio                : 0.494.0

krrishdholakia commented 1 month ago

Hey @devdev999, unable to repro the same behaviour. My base test for this is:

1. Create a simple Dockerfile:

# Use the provided base image
FROM ghcr.io/berriai/litellm:main-latest

# Set the working directory to /app
WORKDIR /app

### [👇 KEY STEP] ###
# Install Prisma CLI and generate Prisma client
RUN pip install prisma 
RUN prisma generate
### FIN #### 

# Expose the necessary port
EXPOSE 4000

# Override the CMD instruction with your desired command and arguments
# WARNING: FOR PROD DO NOT USE `--detailed_debug` it slows down response times, instead use the following CMD
# CMD ["--port", "4000", "--config", "config.yaml"]

# Define the command to run your app
ENTRYPOINT ["litellm"]

CMD ["--port", "4000"]

2. Create a Docker network with no internet connection:

docker network create --internal my_internal_network

3. Run the Docker image:

docker run --network my_internal_network -p 4000:4000 my-docker-image

krrishdholakia commented 1 month ago

This setup works for me without issues (no startup errors).

devdev999 commented 1 month ago

Tried your example and it indeed works fine. The error only seems to occur when connecting to Postgres as an external DB.

Reproducible example docker-compose.yml:

services:
  litellm:
    image: ghcr.io/berriai/litellm:main-v1.42.5-stable-patched-latest
    ports:
      - "4000:4000" # Map the container port to the host, change the host port if necessary
    environment:
      DATABASE_URL: "postgresql://llmproxy:dbpassword9090@db:5432/litellm"
    networks:
      - my_internal_network

  db:
    image: postgres:16.3
    restart: always
    environment:
      POSTGRES_DB: litellm
      POSTGRES_USER: llmproxy
      POSTGRES_PASSWORD: dbpassword9090
    networks:
      - my_internal_network

networks:
  my_internal_network:
    external: true
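To confirm the schema drift directly, you could query Postgres for the column (a sketch against the compose file above; the table name `LiteLLM_SpendLogs` is an assumption based on the Prisma model the traceback references):

```shell
# Count how many columns named requester_ip_address exist on the
# spend-logs table; 0 suggests the Prisma schema push never ran.
count=$(docker compose exec -T db psql -U llmproxy -d litellm -tAc \
  "SELECT count(*) FROM information_schema.columns
   WHERE table_name = 'LiteLLM_SpendLogs'
     AND column_name = 'requester_ip_address';")
if [ "${count:-0}" -ge 1 ]; then
  echo "column present"
else
  echo "column missing - schema was never pushed"
fi
```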

devdev999 commented 1 month ago

Seems to be linked to #4915

devdev999 commented 1 month ago

@krrishdholakia I traced the Docker image version where this started happening to 1.41.18; there were no Prisma startup errors on 1.41.17. From the changelog https://github.com/BerriAI/litellm/compare/v1.41.17...v1.41.18 and the tag https://github.com/BerriAI/litellm/releases/tag/v1.41.18, the only Prisma-related change was in #4640

ishaan-jaff commented 1 month ago

hi @devdev999, thanks for your work - we reverted #4640. Will check if the issue persists on the new release.

@devdev999 can we set up a 1:1 support channel? I'd love to prioritize your issues.