Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0
6.91k stars 912 forks

Installation error - prisma #117

Closed: Kailegh closed this issue 1 year ago

Kailegh commented 1 year ago

I am trying to run Chainlit inside a container built with the following Dockerfile:

FROM python:3.11-slim-buster
WORKDIR /app

RUN apt-get update && \
    apt-get install nodejs -y && \
    apt-get clean

COPY requirements.txt /app
#RUN  pip install --upgrade pip
RUN  pip install -r requirements.txt

COPY . /app

RUN prisma generate
ENV HOST=0.0.0.0
ENV LISTEN_PORT=8000
EXPOSE 8000
CMD ["/bin/bash", "launch_scripts/chainlit_app_custom.sh"]

When I try to run it I get the following error:

chat_ui-chat_ui-1  | 2023-06-29 09:03:53 - Loaded .env file
chat_ui-chat_ui-1  | 2023-06-29 09:03:54 - Your app is available at http://localhost:8000
chat_ui-chat_ui-1  | ERROR:    Traceback (most recent call last):
chat_ui-chat_ui-1  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 677, in lifespan
chat_ui-chat_ui-1  |     async with self.lifespan_context(app) as maybe_state:
chat_ui-chat_ui-1  |   File "/usr/local/lib/python3.11/contextlib.py", line 204, in __aenter__
chat_ui-chat_ui-1  |     return await anext(self.gen)
chat_ui-chat_ui-1  |            ^^^^^^^^^^^^^^^^^^^^^
chat_ui-chat_ui-1  |   File "/usr/local/lib/python3.11/site-packages/chainlit/server.py", line 65, in lifespan
chat_ui-chat_ui-1  |     from prisma import Client, register
chat_ui-chat_ui-1  |   File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
chat_ui-chat_ui-1  |   File "/usr/local/lib/python3.11/site-packages/prisma/__init__.py", line 45, in __getattr__
chat_ui-chat_ui-1  |     raise RuntimeError(
chat_ui-chat_ui-1  | RuntimeError: The Client hasn't been generated yet, you must run `prisma generate` before you can use the client.
chat_ui-chat_ui-1  | See https://prisma-client-py.readthedocs.io/en/stable/reference/troubleshooting/#client-has-not-been-generated-yet
chat_ui-chat_ui-1  | 
chat_ui-chat_ui-1  | ERROR:    Application startup failed. Exiting.

I thought it was an error in my application, so I ran prisma generate as the error message suggested, but it still did not work. So I went back to basics:

pip install chainlit==0.4.2 --force-reinstall
chainlit hello

And still got the same error:

ERROR:    Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 677, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.11/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/chainlit/server.py", line 65, in lifespan
    from prisma import Client, register
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/usr/local/lib/python3.11/site-packages/prisma/__init__.py", line 45, in __getattr__
    raise RuntimeError(
RuntimeError: The Client hasn't been generated yet, you must run `prisma generate` before you can use the client.
See https://prisma-client-py.readthedocs.io/en/stable/reference/troubleshooting/#client-has-not-been-generated-yet

ERROR:    Application startup failed. Exiting.

Any ideas of what may be happening?

willydouhard commented 1 year ago

Hey, the generate command is supposed to be handled by Chainlit. Can I see your chainlit run ... command and your .chainlit/config.toml? Also, is there already a chat.db file in your .chainlit folder?
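For readers checking the same things in their own project: a minimal sketch of the two items asked about above, using a simulated project layout (the demo/ directory and the config contents are illustrative assumptions, not taken from this thread; .chainlit and chat.db are the defaults mentioned above):

```shell
# Simulate a project folder with the kind of config Chainlit writes
# (contents here are illustrative, not the full real file).
mkdir -p demo/.chainlit
printf '[project]\nenable_telemetry = true\n' > demo/.chainlit/config.toml

# Show the config the maintainer asked to see...
cat demo/.chainlit/config.toml

# ...and check for a leftover chat.db, which can carry stale state.
if [ -f demo/.chainlit/chat.db ]; then
  echo "chat.db present"
else
  echo "no chat.db"
fi
```

In a real project you would run the same cat and test against your own .chainlit folder instead of the simulated one.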

Kailegh commented 1 year ago

Ok, never mind. I tried removing the .chainlit folder and I think it just started working. Thanks a lot!!
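For anyone landing on this issue later: the fix above amounts to clearing the stale .chainlit folder so Chainlit recreates it (and, per the maintainer's comment, handles the Prisma client generation itself) on the next start. A minimal sketch, assuming the default folder location; renaming instead of deleting is a precaution I'm adding, not something from the thread:

```shell
# Simulate a stale .chainlit folder for illustration.
mkdir -p .chainlit

# Rename rather than delete, in case it holds a chat.db you want to keep.
[ -d .chainlit ] && mv .chainlit .chainlit.bak

[ -d .chainlit ] || echo ".chainlit removed"
[ -d .chainlit.bak ] && echo "backup created"
# Re-running the app (e.g. chainlit hello) recreates the folder fresh.
```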