Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

Running an extended API endpoint with init_http_context() but chainlit API (cl.Message) isn't working #907

Open antoineross opened 2 months ago

antoineross commented 2 months ago

I'm just using the code from the documentation:

from chainlit.server import app
from fastapi import Request
from fastapi.responses import HTMLResponse
from chainlit.context import init_http_context
import chainlit as cl

@app.get("/hello")
async def hello(request: Request):
    init_http_context()
    await cl.Message(content="Hello World").send()
    return HTMLResponse("Hello World")
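For reference, I'm hitting the endpoint like this (a minimal check, assuming the default chainlit run port of 8000):

import requests

# Call the extended endpoint exposed by the chainlit/FastAPI server
resp = requests.get("http://localhost:8000/hello")
# The HTTP response comes back fine, but the cl.Message never appears anywhere
print(resp.status_code, resp.text)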

And my environment is:

python == 3.11
chainlit == 1.0.502
literalai == 0.0.500
openai == 1.17.1

I have tried the following:

  1. Initiating the HTTP context on either (or both of) the API endpoint and on chat start, but it doesn't seem to work.
  2. Adding a LITERAL_API_KEY to my environment variables and running LITERAL_API_KEY="your key" chainlit run main.py; it still doesn't work.
  3. Using a WebSocket session on the API endpoint instead, which still doesn't work (how I obtain the session_id is sketched after this list). Code for the WS version:

    from chainlit.session import WebsocketSession
    from chainlit.context import init_ws_context

    @app.get("/hello/{session_id}")
    async def hello(request: Request, session_id: str):
        # Look up the live websocket session and bind the chainlit context to it
        ws_session = WebsocketSession.get_by_id(session_id=session_id)
        init_ws_context(ws_session)
        await cl.Message(content="Hello World").send()
        return HTMLResponse("Data sent to the websocket client")
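For the WS variant, I pass the chat's own session id into the URL. This is a minimal sketch of how I get it, assuming cl.user_session.get("id") returns the websocket session id as in the docs example:

@cl.on_chat_start
async def on_chat_start():
    # Expose the current websocket session id so it can be used in /hello/{session_id}
    session_id = cl.user_session.get("id")
    await cl.Message(content=f"Session id: {session_id}").send()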

Have I missed anything?

peiga commented 1 month ago

I can confirm this. Any progress on this issue yet?