Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

RuntimeError: The Client hasn't been generated yet, you must run `prisma generate` before you can use the client. #250

Closed: anyoneai closed this issue 1 year ago

anyoneai commented 1 year ago

Hi there! I'm playing around with Chainlit. When using database = "local" in the config.toml file, I'm able to spin up the app correctly the first time, but if I shut down and try to launch it again I get the following error:

backend_1  | 2023-08-02 18:55:05 - Your app is available at http://localhost:8000                  
backend_1  | ERROR:    Traceback (most recent call last):                                                                                                                                             
backend_1  |   File "/home/app/.local/lib/python3.10/site-packages/starlette/routing.py", line 677, in lifespan
backend_1  |     async with self.lifespan_context(app) as maybe_state:                             
backend_1  |   File "/usr/local/lib/python3.10/contextlib.py", line 199, in __aenter__             
backend_1  |     return await anext(self.gen)                                                      
backend_1  |   File "/home/app/.local/lib/python3.10/site-packages/chainlit/server.py", line 59, in lifespan
backend_1  |     from prisma import Client, register
backend_1  |   File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist                                                                                                                   
backend_1  |   File "/home/app/.local/lib/python3.10/site-packages/prisma/__init__.py", line 45, in __getattr__                                                                                       
backend_1  |     raise RuntimeError(                                                                                                                                                                  
backend_1  | RuntimeError: The Client hasn't been generated yet, you must run `prisma generate` before you can use the client.                                                                        
backend_1  | See https://prisma-client-py.readthedocs.io/en/stable/reference/troubleshooting/#client-has-not-been-generated-yet
backend_1  |                                                                                       
backend_1  | ERROR:    Application startup failed. Exiting.

Any thoughts?

willydouhard commented 1 year ago

Are you starting the app in a fresh Python env every time? We only generate the Prisma types once, when we create the DB. If you switch Python envs after that, you will get this error. You can run `chainlit migrate` to regenerate the Prisma types manually, or delete the `.config/chat.db` file.
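For example (a sketch based on the comment above; the file path is the one from the error report, and the commands assume the `chainlit` CLI is on `PATH` in the affected environment):

```shell
# Regenerate the Prisma types manually in the env where the error occurs:
chainlit migrate

# Or delete the local DB so Chainlit recreates it (and the types) on next start:
rm .config/chat.db
```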

anyoneai commented 1 year ago

Ah, I see. Yeah, I'm running this in a Docker container.

wei-ann-Github commented 1 year ago

Hi @willydouhard ,

Is it possible to keep writing to `.config/chat.db` every time I start the container? It seems quite a pain to have to delete/back up the db every time.

By the way, I am using mapped volumes in the container, so `chat.db` persists even after I have shut down the container.
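(For illustration, a hypothetical `docker run` invocation for that mapped-volume setup, assuming the `/chainlit_app` working directory from the Dockerfile below and a made-up image tag:)

```shell
# Map a host directory over the in-container .config dir so chat.db
# survives container shutdown (host path and image tag are hypothetical):
docker run -v "$(pwd)/data:/chainlit_app/.config" -p 8000:8000 my-chainlit
```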

You mention `chainlit migrate`. Where can I run this in the Dockerfile? Here is my Dockerfile:

ARG CURRENT_DIR
FROM python:3.10

# Set the working directory to /chainlit
WORKDIR /chainlit_app

# Copy the requirements.txt and app.py from the host to the container
COPY $CURRENT_DIR/requirements.txt .
COPY $CURRENT_DIR/chainlit/app.py .

# Install the requirements
RUN pip install --no-cache-dir -r requirements.txt && \
    chainlt migrate

CMD ["chainlit", "run", "app.py", "-w"]

willydouhard commented 1 year ago

You can just add a RUN statement, imo. In your current code you have a typo: "chainlt" should be "chainlit".

wei-ann-Github commented 1 year ago

> You can just add a RUN statement, imo. In your current code you have a typo: "chainlt" should be "chainlit".

@willydouhard, cheers for spotting the typo. The error is still there, though:

chainlit_chatui-chainlit-1  | ERROR:    Traceback (most recent call last):
chainlit_chatui-chainlit-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 677, in lifespan
chainlit_chatui-chainlit-1  |     async with self.lifespan_context(app) as maybe_state:
chainlit_chatui-chainlit-1  |   File "/usr/local/lib/python3.10/contextlib.py", line 199, in __aenter__
chainlit_chatui-chainlit-1  |     return await anext(self.gen)
chainlit_chatui-chainlit-1  |   File "/usr/local/lib/python3.10/site-packages/chainlit/server.py", line 59, in lifespan
chainlit_chatui-chainlit-1  |     from prisma import Client, register  # type: ignore[attr-defined]
chainlit_chatui-chainlit-1  |   File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
chainlit_chatui-chainlit-1  |   File "/usr/local/lib/python3.10/site-packages/prisma/__init__.py", line 45, in __getattr__
chainlit_chatui-chainlit-1  |     raise RuntimeError(
chainlit_chatui-chainlit-1  | RuntimeError: The Client hasn't been generated yet, you must run `prisma generate` before you can use the client.
chainlit_chatui-chainlit-1  | See https://prisma-client-py.readthedocs.io/en/stable/reference/troubleshooting/#client-has-not-been-generated-yet

I realized that as long as I don't rebuild the image, the message does not appear.
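One possible workaround (a sketch, not an official fix): run `chainlit migrate` at container start via an entrypoint script instead of at build time, so the Prisma client is regenerated after every rebuild, before the app starts:

```shell
#!/bin/sh
# entrypoint.sh (hypothetical): regenerate the Prisma types on startup,
# then hand off to the Chainlit server as the container's main process.
set -e
chainlit migrate
exec chainlit run app.py -w
```

In the Dockerfile, this would mean copying `entrypoint.sh` into the image and replacing the `CMD` line with something like `CMD ["sh", "entrypoint.sh"]`.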