Open Petopp opened 1 month ago
Flowise Version 1.7.2
I also want to know how to handle the stream with Python.
For now you will have to use Socket.IO for streaming: https://docs.flowiseai.com/using-flowise/streaming. We're working on changing that to SSE.
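For anyone looking for a minimal Python version of that flow, here is a rough sketch (the port and chatflow ID are placeholders, and it assumes the python-socketio and requests packages): connect a Socket.IO client first, pass its sid as socketIOClientId in the prediction request, and tokens then arrive via the token event.

```python
# Minimal sketch of the documented Socket.IO streaming flow.
# The server URL and chatflow ID are placeholders -- substitute your own.
import requests
import socketio

SERVER = "http://localhost:3000"
FLOW_URL = f"{SERVER}/api/v1/prediction/<your-chatflow-id>"

sio = socketio.Client()

@sio.on("token")
def on_token(token):
    # each streamed token arrives here while the chatflow is generating
    print(token, end="", flush=True)

@sio.on("end")
def on_end():
    sio.disconnect()  # stops sio.wait() below

sio.connect(SERVER)
requests.post(FLOW_URL, json={
    "question": "Hey, how are you?",
    "socketIOClientId": sio.sid,  # tells Flowise where to stream tokens
})
sio.wait()
```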
Hey Henry, how is it going with the SSE? @HenryHengZJ
Hi Team,
Could you please help me with this Python code?
The Node.js version is working fine.
```python
import asyncio
import json

import aiohttp
import socketio

SERVER = "http://localhost:54000"

sio = socketio.AsyncClient(logger=True, engineio_logger=True)


async def query(data):
    # POST the prediction request; the streamed tokens should arrive
    # over the Socket.IO connection, not in this HTTP response
    async with aiohttp.ClientSession() as session:
        async with session.post(
            f"{SERVER}/api/v1/prediction/7af1f9f3-dd43-4ce4-b76e-bf45103010f5",
            headers={"Content-Type": "application/json"},
            data=json.dumps(data),
        ) as response:
            result = await response.json()
            return result


@sio.event
async def connect():
    print('connected to the server')
    question = "Hey, how are you?"
    # pass the client's sid so Flowise knows where to stream the tokens
    result = await query({"question": question, "socketIOClientId": sio.sid})
    print(json.dumps(result))


@sio.on('start')
async def start():
    print("start event received")


@sio.on('token')
async def token(token):
    print(f"token event received with token: {token}")


@sio.on('end')
async def end():
    print("end event received")
    await sio.disconnect()


async def main():
    await sio.connect(SERVER)
    await sio.wait()


asyncio.run(main())
```
It's not streaming anything. None of the @sio.on decorators are getting triggered.
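One way to narrow this down might be a wildcard handler (supported in python-socketio >= 5.4) that logs every event the server sends, to confirm whether Flowise is emitting anything to this client at all:

```python
# Debugging sketch: catch-all handler (python-socketio >= 5.4) that logs
# every event arriving on this connection.
@sio.on('*')
async def catch_all(event, *args):
    print(f"received event {event!r} with args: {args}")
```

If nothing arrives at all, it may also be worth confirming that the chatflow itself supports streaming before suspecting the client code.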
Hello everyone,
I am trying to process the LLM response from a Flowise endpoint as a stream, e.g. to render it token by token in Streamlit, as you know it from OpenAI and similar APIs.
Unfortunately I have not been able to get this working. The request itself succeeds, but the response does not arrive as a stream.
Does anyone have any ideas on how to do this?
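One pattern that might work (a sketch, not a Flowise-specific API: it assumes tokens arrive via the Socket.IO token event as in the snippets above, and st.write_stream requires Streamlit >= 1.31) is to push tokens from the event handler into a queue and let Streamlit consume that queue as a generator:

```python
# Sketch: bridge Socket.IO token events into a Streamlit stream.
# Placeholders: server URL and chatflow ID. Assumes the python-socketio,
# requests, and streamlit packages.
import queue
import threading

import requests
import socketio
import streamlit as st

SERVER = "http://localhost:3000"
FLOW_URL = f"{SERVER}/api/v1/prediction/<your-chatflow-id>"

tokens = queue.Queue()
sio = socketio.Client()

@sio.on("token")
def on_token(token):
    tokens.put(token)   # hand each streamed token to the generator

@sio.on("end")
def on_end():
    tokens.put(None)    # sentinel: generation finished

def token_stream():
    # generator that st.write_stream can consume token by token
    while (tok := tokens.get()) is not None:
        yield tok

question = st.text_input("Question")
if question:
    sio.connect(SERVER)
    # fire the request in a background thread so tokens can render
    # while the HTTP call is still in flight
    threading.Thread(
        target=requests.post,
        args=(FLOW_URL,),
        kwargs={"json": {"question": question, "socketIOClientId": sio.sid}},
        daemon=True,
    ).start()
    st.write_stream(token_stream())
    sio.disconnect()
```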
Here is the Flowise part: