Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

Display a loader #599

Closed: sergej-d closed this issue 8 months ago

sergej-d commented 9 months ago

Hello,

how do I get the loader shown in this example (https://docs.chainlit.io/chat-experience/loader)? Unfortunately, I only get an empty message while the processing runs.
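
For reference, the pattern on that page looks roughly like this (a sketch, not the verbatim docs code):

import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # sending an empty message displays the loader
    msg = cl.Message(content="")
    await msg.send()

    # simulate some work
    await cl.sleep(2)

    msg.content = "Done!"
    await msg.update()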

Thank you!

willydouhard commented 9 months ago

Do you run the exact same code? What version of chainlit are you using?

sergej-d commented 9 months ago

Yes, I ran the exact same code. Here is my environment (pip list):

aiofiles 23.2.1
aiohttp 3.9.1
aiosignal 1.3.1
annotated-types 0.6.0
anyio 3.7.1
asgiref 3.7.2
asyncer 0.0.2
attrs 23.1.0
backoff 2.2.1
bcrypt 4.1.1
bidict 0.22.1
cachetools 5.3.2
certifi 2023.11.17
chainlit 0.7.700
charset-normalizer 3.3.2
chroma-hnswlib 0.7.3
chromadb 0.4.18
click 8.1.7
coloredlogs 15.0.1
dataclasses-json 0.5.14
Deprecated 1.2.14
distro 1.8.0
dnspython 2.4.2
fastapi 0.100.1
fastapi-socketio 0.0.10
filelock 3.13.1
filetype 1.2.0
flatbuffers 23.5.26
frozenlist 1.4.0
fsspec 2023.10.0
google-auth 2.23.4
googleapis-common-protos 1.61.0
grpcio 1.59.3
h11 0.14.0
httpcore 0.17.3
httptools 0.6.1
httpx 0.24.1
huggingface-hub 0.19.4
humanfriendly 10.0
idna 3.6
importlib-metadata 6.8.0
importlib-resources 6.1.1
jsonpatch 1.33
jsonpointer 2.4
kubernetes 28.1.0
langchain 0.0.342
langchain-core 0.0.7
langsmith 0.0.67
Lazify 0.4.0
loguru 0.7.2
marshmallow 3.20.1
mmh3 4.0.1
monotonic 1.6
mpmath 1.3.0
multidict 6.0.4
mypy-extensions 1.0.0
nest-asyncio 1.5.8
numexpr 2.8.8
numpy 1.26.2
oauthlib 3.2.2
onnxruntime 1.16.3
openai 1.3.6
opentelemetry-api 1.21.0
opentelemetry-exporter-otlp 1.21.0
opentelemetry-exporter-otlp-proto-common 1.21.0
opentelemetry-exporter-otlp-proto-grpc 1.21.0
opentelemetry-exporter-otlp-proto-http 1.21.0
opentelemetry-instrumentation 0.42b0
opentelemetry-instrumentation-asgi 0.42b0
opentelemetry-instrumentation-fastapi 0.42b0
opentelemetry-proto 1.21.0
opentelemetry-sdk 1.21.0
opentelemetry-semantic-conventions 0.42b0
opentelemetry-util-http 0.42b0
overrides 7.4.0
packaging 23.2
pinecone-client 2.2.4
pip 23.2.1
posthog 3.0.2
protobuf 4.25.1
pulsar-client 3.3.0
pyasn1 0.5.1
pyasn1-modules 0.3.0
pydantic 2.5.2
pydantic_core 2.14.5
PyJWT 2.8.0
PyMuPDF 1.23.7
PyMuPDFb 1.23.7
pypdf 3.17.1
PyPDF2 3.0.1
PyPika 0.48.9
python-dateutil 2.8.2
python-dotenv 1.0.0
python-engineio 4.8.0
python-graphql-client 0.4.3
python-multipart 0.0.6
python-socketio 5.10.0
PyYAML 6.0.1
regex 2023.10.3
requests 2.31.0
requests-oauthlib 1.3.1
rsa 4.9
setuptools 68.1.2
simple-websocket 1.0.0
six 1.16.0
sniffio 1.3.0
SQLAlchemy 2.0.23
starlette 0.27.0
sympy 1.12
syncer 2.0.3
tenacity 8.2.3
tiktoken 0.5.1
tokenizers 0.15.0
tomli 2.0.1
tqdm 4.66.1
typer 0.9.0
typing_extensions 4.8.0
typing-inspect 0.9.0
uptrace 1.21.0
urllib3 1.26.18
uvicorn 0.23.2
uvloop 0.19.0
watchfiles 0.20.0
websocket-client 1.6.4
websockets 12.0
wrapt 1.16.0
wsproto 1.2.0
yarl 1.9.3
zipp 3.17.0

Do you have any idea what could be wrong with my configuration?

PyroGenesis commented 8 months ago

I have the same symptom, but the example code is fine. It turns out there needs to be a cl.sleep() call after msg.send(), otherwise the loader never displays.

So this code never displays the loader:

import chainlit as cl

@cl.on_message
async def run_conversation(message: cl.Message):
    msg = cl.Message(content="")
    await msg.send()

    # do a bunch of sync and async operations here
    # that produce result_text

    msg.content += result_text
    await msg.update()

but this code does:

import chainlit as cl

@cl.on_message
async def run_conversation(message: cl.Message):
    msg = cl.Message(content="")
    await msg.send()
    await cl.sleep(0)  # extra line here

    # do a bunch of sync and async operations here
    # that produce result_text

    msg.content += result_text
    await msg.update()
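
Presumably this works because awaiting cl.sleep(0) yields control back to the asyncio event loop, giving Chainlit a chance to flush the empty message (and its loader) to the client before any blocking work starts.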

willydouhard commented 8 months ago

I believe it is because the handler executes too fast. If you do some processing between the .send() and the .update() calls, the loader should display.
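
For example, awaiting real work between the two calls keeps the loader visible. Here is a minimal sketch (heavy_sync_work is a hypothetical stand-in for blocking work; cl.make_async runs it in a worker thread so the event loop stays responsive):

import chainlit as cl

def heavy_sync_work(text: str) -> str:
    # hypothetical stand-in for blocking CPU- or IO-bound work
    return text.upper()

@cl.on_message
async def run_conversation(message: cl.Message):
    msg = cl.Message(content="")
    await msg.send()  # the loader starts spinning here

    # awaiting work between send() and update() keeps the loader visible;
    # make_async moves the blocking call off the event loop
    result_text = await cl.make_async(heavy_sync_work)(message.content)

    msg.content = result_text
    await msg.update()  # the loader is replaced by the content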