FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

No streaming with Python #2442

Open Petopp opened 1 month ago

Petopp commented 1 month ago

Hello everyone,

I am trying to process the response from the LLM at a Flowise endpoint in a structured way, e.g. to stream it into Streamlit token by token, as you know it from OpenAI etc.

Unfortunately I am not able to do this. The request itself works, but the response does not arrive as a stream.

import requests

API_URL = "http://192.168.0.133:7000/api/v1/prediction/e8c074c0-0956-4cdf-9786-86b0aa47a989"

def query(payload):

    response = requests.post(API_URL, json=payload, stream=True)

    if response.status_code == 200:
        for line in response.iter_lines():
            if line:
                # Process the streaming data here
                data = line.decode('utf-8')
                print("Stream:", data)

    else:
        print("Error:", response.status_code)

# Example query
query({
    "question": "How fast is light?",
    "overrideConfig": {
        "sessionId": "user1"
    }
})

Does anyone have any ideas on how to do this?

Here is the Flowise part:

[image: screenshot of the Flowise flow]

Petopp commented 1 month ago

Flowise Version 1.7.2

csningli commented 1 month ago

I also want to know how to handle the stream with Python.

HenryHengZJ commented 1 month ago

For now you will have to use Socket.IO for streaming: https://docs.flowiseai.com/using-flowise/streaming. We're working on changing that to SSE.

xu-dong-bl commented 3 weeks ago

Hey Henry, how is it going with the SSE? @HenryHengZJ

saidharanidhar commented 3 days ago

Hi Team,

Could you please help me with this Python code?

The nodejs version is working fine.

import asyncio
import json
import aiohttp
import socketio

SERVER = "http://localhost:54000"

sio = socketio.AsyncClient(logger=True, engineio_logger=True)

async def query(data):
    async with aiohttp.ClientSession() as session:
        async with session.post(
                f"{SERVER}/api/v1/prediction/7af1f9f3-dd43-4ce4-b76e-bf45103010f5",
                headers={"Content-Type": "application/json"},
                data=json.dumps(data),
        ) as response:
            result = await response.json()
            return result

@sio.event
async def connect():
    print('connected to the server')
    question = "Hey, how are you?"
    result = await query({"question": question, "socketIOClientId": sio.sid})
    print(json.dumps(result))

@sio.on('start')
async def start():
    print("start event received")

@sio.on('token')
async def token(token):
    print(f"token event received with token: {token}")

@sio.on('end')
async def end():
    print("end event received")
    await sio.disconnect()

async def main():
    await sio.connect(SERVER)
    await sio.wait()

asyncio.run(main())

It's not streaming anything. None of the @sio.on handlers are getting triggered.
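For reference, the first script in this thread does not stream because the plain REST prediction endpoint returns the whole answer as a single JSON body; `stream=True` on the client side cannot change that. Once Flowise switches to SSE as Henry mentioned, consuming the stream from Python should become straightforward. Below is a minimal, stdlib-only sketch under stated assumptions: the URL, the `Accept: text/event-stream` handshake, and the payload of each `data:` line are guesses, since the final Flowise SSE format has not been published yet. Only the SSE line parsing follows the standard `field: value` wire format.

```python
import json
import urllib.request

# Hypothetical endpoint, reusing the URL from the first post in this thread.
API_URL = "http://192.168.0.133:7000/api/v1/prediction/e8c074c0-0956-4cdf-9786-86b0aa47a989"

def parse_sse_line(raw: bytes):
    """Parse one line of an SSE stream into a (field, value) pair.

    Returns None for blank lines (event separators) and comment lines
    (lines starting with a colon), per the text/event-stream format.
    """
    line = raw.decode("utf-8").rstrip("\r\n")
    if not line or line.startswith(":"):
        return None
    field, _, value = line.partition(":")
    # A single leading space after the colon is not part of the value.
    return field, value.lstrip(" ")

def stream_prediction(question: str):
    """POST the question and yield the value of every `data:` line.

    Assumes the server keeps the response open and emits SSE events;
    this is an assumption about the future Flowise behavior.
    """
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({"question": question}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        for raw in resp:  # iterating an HTTP response yields raw lines
            parsed = parse_sse_line(raw)
            if parsed and parsed[0] == "data":
                yield parsed[1]
```

Usage would then be `for chunk in stream_prediction("How fast is light?"): print("Stream:", chunk)`, printing tokens as they arrive instead of one final JSON blob.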