dottxt-ai / outlines

Structured Text Generation
https://dottxt-ai.github.io/outlines/

FastAPI x Outlines : RuntimeError: Cannot run the event loop while another loop is running #1252

Open · rcourivaud opened this issue 2 weeks ago

rcourivaud commented 2 weeks ago

Describe the issue as clearly as possible:

I've developed an API with FastAPI and I'm trying to use outlines to extract data from text.

However, I get an error related to asynchronous event loops. I've tried running outlines in another thread and disabling it, but neither works. Do you have any tips?

Here is the error:

```
web-1  |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 70, in app
web-1  |     response = await func(request)
web-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 299, in app
web-1  |     raise e
web-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 294, in app
web-1  |     raw_response = await run_endpoint_function(
web-1  |   File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
web-1  |     return await dependant.call(**values)
web-1  |   File "/app/app/routers/equipment.py", line 20, in text_to_equipment
web-1  |     return await create_equipment_from_text(equipment_text.text)
web-1  |   File "/app/app/services/equipment.py", line 9, in create_equipment_from_text
web-1  |     equipment = await build_equipment_from_string(text)
web-1  |   File "/app/app/dependencies/outlines/services/equipments.py", line 28, in build_equipment_from_string
web-1  |     base_equipment = await build_base_equipment_from_string(text)
web-1  |   File "/app/app/dependencies/outlines/services/equipments.py", line 14, in build_base_equipment_from_string
web-1  |     base_equipment = await generator(extraction_prompt)  # Await added here
web-1  |   File "/usr/local/lib/python3.10/site-packages/outlines/models/openai.py", line 145, in __call__
web-1  |     response, prompt_tokens, completion_tokens = generate_chat(
web-1  |   File "/usr/local/lib/python3.10/site-packages/outlines/base.py", line 61, in __call__
web-1  |     return self.call_with_signature(*args, **kwargs)
web-1  |   File "/usr/local/lib/python3.10/site-packages/outlines/base.py", line 166, in call_with_signature
web-1  |     outputs = self.vectorize_call_coroutine(broadcast_shape, args, kwargs)
web-1  |   File "/usr/local/lib/python3.10/site-packages/outlines/base.py", line 255, in vectorize_call_coroutine
web-1  |     outputs = loop.run_until_complete(create_and_gather_tasks())
web-1  |   File "uvloop/loop.pyx", line 1512, in uvloop.loop.Loop.run_until_complete
web-1  |   File "uvloop/loop.pyx", line 1505, in uvloop.loop.Loop.run_until_complete
web-1  |   File "uvloop/loop.pyx", line 1379, in uvloop.loop.Loop.run_forever
```
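
If it helps, this looks like the generic asyncio restriction rather than something FastAPI-specific: judging from the traceback, outlines calls `run_until_complete` (outlines/base.py, line 255 above) while uvicorn's uvloop event loop is already running in the same thread. Here is a minimal sketch that raises the same `RuntimeError` with plain asyncio, without FastAPI or outlines involved:

```python
import asyncio

async def handler():
    # Inside a running loop (like a FastAPI request handler), trying to drive
    # a second loop synchronously is rejected by asyncio itself.
    inner = asyncio.new_event_loop()
    inner.run_until_complete(asyncio.sleep(0))
    # -> RuntimeError: Cannot run the event loop while another loop is running

asyncio.run(handler())
```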

Steps/code to reproduce the bug:

My Dockerfile:

```dockerfile
FROM python:3.10-slim

ENV PYTHONUNBUFFERED True

ENV APP_HOME /app
WORKDIR $APP_HOME
COPY ./requirements.txt $APP_HOME/requirements.txt

RUN apt update && apt install -y libpq-dev python3-dev gcc
RUN pip install --no-cache-dir -r requirements.txt

COPY ./app $APP_HOME/app

CMD ["gunicorn", "--bind", ":$PORT", "--worker-class", "uvicorn.workers.UvicornWorker", "--workers", "1", "--threads", "8", "--timeout", "0", "app.main:app"]
```

I use the outlines generator like this:

```python
async def build_base_equipment_from_string(text: str):
    model = get_equipment_model()
    generator = outlines.generate.json(model, EquipmentBase)

    extraction_prompt = extraction(equipment_description=text)
    base_equipment = await generator(extraction_prompt)  # Await added here
    return base_equipment
```

and the router:

```python
@router.post("/text", response_model=EquipmentOut)
async def text_to_equipment(equipment_text: EquipmentTextInput):
    return await create_equipment_from_text(equipment_text.text)
```
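
For reference, the thread-based workaround I mentioned above looked roughly like this. It is a simplified sketch that assumes the generator can be called synchronously from a worker thread that has its own, non-running event loop (`get_equipment_model`, `extraction` and `EquipmentBase` are the same as in the code above):

```python
import asyncio

import outlines

def _generate_sync(text: str):
    # Give the worker thread its own event loop so outlines'
    # internal run_until_complete does not collide with uvicorn's loop.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        model = get_equipment_model()
        generator = outlines.generate.json(model, EquipmentBase)
        extraction_prompt = extraction(equipment_description=text)
        return generator(extraction_prompt)
    finally:
        asyncio.set_event_loop(None)
        loop.close()

async def build_base_equipment_from_string(text: str):
    # Off-load the blocking call to the default thread pool.
    return await asyncio.to_thread(_generate_sync, text)
```

As noted above, this still fails for me, so any pointer on how outlines resolves the event loop internally would help.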

Expected result:

It should run correctly and return the extracted schema.

Error message:

No response

Outlines/Python version information:

```
outlines==0.1.1
fastapi==0.108.0
pydantic==2.5.3
```

Context for the issue:

No response