Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

AskUserMessage always takes the full timeout time to complete #319

Closed · monaxu1 closed this issue 1 year ago

monaxu1 commented 1 year ago

Version: chainlit==0.6.2

Problem: Time taken to get the response from AskUserMessage is always the full timeout time (60 secs).

```python
import time
import chainlit as cl

time_start = time.time()
res = await cl.AskUserMessage(content=question, timeout=60).send()
time_end = time.time()
print(f"Async: Time taken (secs): {time_end - time_start}")
```
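For reference, a timeout-guarded await is expected to return as soon as the reply arrives, not after the full timeout. A minimal, Chainlit-free sketch of that pattern using `asyncio.wait_for` (the names here are illustrative and not Chainlit internals):

```python
import asyncio
import time
from typing import Optional, Tuple

async def ask_with_timeout(reply: "asyncio.Future[str]", timeout: float) -> Optional[str]:
    # Wait for a reply future, giving up after `timeout` seconds.
    # This mirrors the behavior AskUserMessage.send() should exhibit.
    try:
        return await asyncio.wait_for(reply, timeout=timeout)
    except asyncio.TimeoutError:
        return None

async def main() -> Tuple[float, Optional[str]]:
    loop = asyncio.get_running_loop()
    reply: "asyncio.Future[str]" = loop.create_future()
    # Simulate the user answering after 0.1 seconds.
    loop.call_later(0.1, reply.set_result, "Alice")
    start = time.monotonic()
    res = await ask_with_timeout(reply, timeout=60)
    return time.monotonic() - start, res

elapsed, res = asyncio.run(main())
print(f"Reply: {res}, time taken (secs): {elapsed:.2f}")
```

Here the call completes in roughly 0.1 seconds despite the 60-second timeout, which is the behavior the issue reports as missing.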
willydouhard commented 1 year ago

Are you using AskUserMessage in either cl.on_chat_start or cl.on_message?

monaxu1 commented 1 year ago

It's with @cl.on_chat_start

willydouhard commented 1 year ago

This is the code of chainlit hello:

```python
# This is a simple example of a chainlit app.

from chainlit import AskUserMessage, Message, on_chat_start

@on_chat_start
async def main():
    res = await AskUserMessage(content="What is your name?", timeout=30).send()
    if res:
        await Message(
            content=f"Your name is: {res['content']}.\nChainlit installation is working!\nYou can now start building your own chainlit apps!",
        ).send()
```

Both this code and chainlit hello work on my end. Can you confirm?

monaxu1 commented 1 year ago

Actually, I'm using it in a function that is called by the main function decorated with @cl.on_chat_start:

```python
import time

import chainlit as cl

# `constants` and `build_qa_pipeline` are part of the project's own modules.

async def async_ask_question(question: str) -> str:
    print("Async: Asking question")
    time_start = time.time()
    res = await cl.AskUserMessage(content=question, timeout=60).send()
    time_end = time.time()
    print(f"Async: Time taken (secs): {time_end - time_start}")
    if res:
        return res["content"]
    else:
        return "No response"

@cl.on_chat_start
async def on_chat_start():
    msg = cl.Message(content="Setting up model...")
    await msg.send()

    # Instantiate the agent for that user session
    agent = build_qa_pipeline(
        llm_model_name=constants.LLM_MODEL_NAME,
        ask_question_fn=async_ask_question,
    )

    # Let the user know that the system is ready
    msg.content = "Model set up. You can now ask questions!"
    await msg.update()

    # Store the agent in the user session
    cl.user_session.set("llm_agent", agent)
```
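One possible culprit, assuming `build_qa_pipeline` (whose internals are not shown here) ends up invoking `ask_question_fn` from a worker thread rather than from the session's event loop: a coroutine awaited outside that loop may never see the websocket reply, so the call only returns once the timeout expires. If that is the case, the generic asyncio fix is to schedule the callback back onto the original loop with `asyncio.run_coroutine_threadsafe`. A hedged, Chainlit-free sketch (all names here are illustrative):

```python
import asyncio

async def async_ask(question: str) -> str:
    # Stand-in for async_ask_question: awaits something on the event loop.
    await asyncio.sleep(0.05)
    return f"echo: {question}"

def agent_worker(loop: asyncio.AbstractEventLoop) -> str:
    # Running inside a worker thread: schedule the coroutine back onto the
    # loop that owns the chat session instead of calling it directly.
    future = asyncio.run_coroutine_threadsafe(async_ask("What is your name?"), loop)
    return future.result(timeout=5)

async def main() -> str:
    loop = asyncio.get_running_loop()
    # Simulate an agent pipeline that does its work in an executor thread.
    return await loop.run_in_executor(None, agent_worker, loop)

answer = asyncio.run(main())
print(answer)
```

Whether this matches the actual failure depends on how the agent pipeline calls the function, which the snippet above does not show.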
willydouhard commented 1 year ago

Does chainlit hello work on your end?