
Pipeline with llamaindex streaming response does not work #275

Closed Zeeeeta closed 2 months ago

Zeeeeta commented 2 months ago

I am using llamaindex with an Open WebUI pipeline, following the example at https://github.com/open-webui/pipelines/blob/main/examples/pipelines/rag/llamaindex_pipeline.py

However, when I try to use a streaming response as in the example, returning with "return response.response_gen", Open WebUI is unable to display any result and leaves the chat as shown below.

[screenshot: the chat remains empty] There is no error.
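
For reference, a minimal sketch of the pipeline along the lines of the linked example (the import paths, the "./data" directory, and the index setup in on_startup are illustrative assumptions, not my exact code):

```python
from typing import Generator, Iterator, List, Union


class Pipeline:
    def __init__(self):
        self.index = None  # built once in on_startup

    async def on_startup(self):
        # Illustrative setup: load documents and build a vector index.
        # Import paths assume a recent llama_index release.
        from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

        documents = SimpleDirectoryReader("./data").load_data()
        self.index = VectorStoreIndex.from_documents(documents)

    async def on_shutdown(self):
        pass

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # Streaming query: response.response_gen is a generator of text chunks,
        # which Open WebUI is expected to iterate and render incrementally.
        query_engine = self.index.as_query_engine(streaming=True)
        response = query_engine.query(user_message)
        return response.response_gen
```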

Running the llamaindex streaming response locally with "response.print_response_stream()" does print the result word by word.
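
The local check was roughly this (again a sketch; the query string and import paths are placeholders):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(streaming=True)

response = query_engine.query("What do these documents cover?")
response.print_response_stream()  # prints the answer word by word as tokens arrive
```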

Any help would be appreciated, thank you.

Edit: My bad, this was caused by my debug-message logging.
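
For anyone hitting the same symptom: the edit above does not say exactly what the logging did, but one plausible failure mode, assumed here purely for illustration, is that the debug logging materialized the stream and exhausted the generator before Open WebUI could iterate it. A sketch of the pitfall and a fix:

```python
import logging
from typing import Generator


# pipe method of the Pipeline class sketched above
def pipe(self, user_message: str, model_id: str, messages: list, body: dict) -> Generator:
    query_engine = self.index.as_query_engine(streaming=True)
    response = query_engine.query(user_message)

    # BAD: materializing the stream for a log line exhausts the generator,
    # so Open WebUI receives an already-empty iterator and renders nothing:
    #     logging.debug("full response: %s", "".join(response.response_gen))
    #     return response.response_gen

    # OK: log each chunk as it passes through, leaving the stream intact.
    def logged_stream():
        for chunk in response.response_gen:
            logging.debug("chunk: %r", chunk)
            yield chunk

    return logged_stream()
```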