Open wi0lono opened 7 months ago
@wi0lono Can you please share more details regarding the issue? Do you mean the output returned to the channel?
Yes. Within my FSM, I have:
self.send_message("message1")
# long running process
self.send_message("message2")
Ideally message1 should be returned to the user as soon as the first send_message() runs, but all messages seem to be returned at once, when the FSM waits for input or ends.
At the moment this should be handled in FSM design. If you break this into 2 different states:
state n:
    self.send_message("message1")
    # automatically move forward
state n+1:
    # long process
    self.send_message("message2")
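For illustration, here is a generic, stand-alone sketch of that two-state split using the transitions library. The class, state names, and the send_message stub are assumptions for the example, not the actual JB-Manager FSM API:

import time
from transitions import Machine

class Bot:
    states = ["ack", "long_task", "done"]

    def __init__(self):
        self.machine = Machine(model=self, states=Bot.states, initial="ack")
        self.machine.add_transition("advance", "ack", "long_task", after="do_long_task")
        self.machine.add_transition("finish", "long_task", "done")

    def send_message(self, text):
        print("sent:", text)  # stub: the real bot would push this to the channel

    def run(self):
        # State n: send the first message, then move forward automatically
        self.send_message("message1")
        self.advance()

    def do_long_task(self):
        # State n+1: long running process, then the second message
        time.sleep(5)
        self.send_message("message2")
        self.finish()

bot = Bot()
bot.run()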
We should create a separate item for a streaming version of JB-Manager that can work with custom channels that support a streaming interface.
My sense is that this is going to be a significant change in architecture (creating a v3) and therefore we should do this as a prototype outside of this repo first.
@sameersegal
At the moment this should be handled in FSM design. If you break this into 2 different states:
state n:
    self.send_message("message1")
    # automatically move forward
state n+1:
    # long process
    self.send_message("message2")
The suggested approach won't work, as currently the output from the FSM reaches the flow main process only when the FSM waits for user input or finishes execution. Dividing into separate states won't solve the issue of getting FSM output in real time, since the output buffers before going to flow.
This behavior is due to the fact that subprocess.run only returns when the fsm runner process completes (which happens in one of the above-mentioned conditions).
One way to solve this issue would be to use a FIFO pipe to communicate between the FSM runner process and the flow main process.
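For illustration, a minimal, self-contained sketch of the FIFO idea (the file name, the JSON line format, and the inline runner script are assumptions for the example, not the actual fsm_runner code):

import json
import os
import subprocess
import sys
import tempfile

# POSIX-only sketch: create a named pipe that both processes share.
fifo_path = os.path.join(tempfile.mkdtemp(), "fsm_output.fifo")
os.mkfifo(fifo_path)

# Stand-in for the fsm_runner process: it writes one message, simulates the long
# running step, then writes the second message.
runner_code = f"""
import json, time
with open({fifo_path!r}, "w") as fifo:
    fifo.write(json.dumps({{"message": "message1"}}) + "\\n")
    fifo.flush()              # message1 becomes visible to the reader immediately
    time.sleep(5)             # long running process
    fifo.write(json.dumps({{"message": "message2"}}) + "\\n")
    fifo.flush()
"""
runner = subprocess.Popen([sys.executable, "-c", runner_code])

# Stand-in for the flow main process: each line arrives as soon as it is written,
# instead of only after the runner exits.
with open(fifo_path) as fifo:
    for line in fifo:
        payload = json.loads(line)
        print("received in real time:", payload["message"])

runner.wait()
os.remove(fifo_path)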
I understand it better now. Thanks
Let's look at this in a bit
@sameersegal and @shreypandey, could you please provide an update on this?
Needs more information before it can be converted to the C4GT issue template.
The issue below can also be addressed by this fix:
Describe the bug
While in a state, if the user sends multiple messages, the bot can end up producing junk and sending back a range of responses, sometimes even resetting the state.
Steps to reproduce
In any state, send more than one message to the bot.
Expected Behavior
The bot should ideally terminate the earlier request, resend the request with the new context/messages, and process it once.
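A rough sketch of that expected behaviour, assuming an asyncio-based handler; every name here (on_user_message, process_messages, the per-user dicts) is illustrative and not existing JB-Manager code:

import asyncio

inflight: dict[str, asyncio.Task] = {}
pending: dict[str, list[str]] = {}

async def process_messages(user_id: str, messages: list[str]) -> None:
    await asyncio.sleep(5)            # stand-in for the FSM / LLM work
    print(f"{user_id}: processed {messages} as one request")
    pending[user_id] = []             # context is consumed only after a successful run

async def on_user_message(user_id: str, text: str) -> None:
    pending.setdefault(user_id, []).append(text)
    task = inflight.get(user_id)
    if task and not task.done():
        task.cancel()                 # drop the stale request; redo it with the new context
    inflight[user_id] = asyncio.create_task(
        process_messages(user_id, list(pending[user_id]))
    )

async def main() -> None:
    await on_user_message("user1", "hello")
    await asyncio.sleep(1)
    await on_user_message("user1", "actually, one more thing")  # cancels the first run
    await asyncio.sleep(6)

asyncio.run(main())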
This is a big/complex architectural change. Design needs to be done, so we will need further detailed discussions on the way forward.
Varun suggested another option: treating time as a delimiter - take all requests from the user until the user stops, then process everything together as a single message. @KaranrajM and @DevvStrange to brainstorm and get back with their thoughts by this Friday (18th Oct).
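For reference, a rough sketch of the time-as-a-delimiter idea; the 10-second window, the queue, and the handler names are illustrative assumptions, not existing JB-Manager code:

import queue
import threading
import time

QUIET_WINDOW = 10  # seconds of user silence before the buffered messages are processed

incoming: "queue.Queue[str]" = queue.Queue()

def process_batch(messages: list[str]) -> None:
    # Placeholder: hand the combined text to the FSM as a single input.
    print("processing as one message:", " | ".join(messages))

def debounce_loop() -> None:
    buffered: list[str] = []
    while True:
        try:
            msg = incoming.get(timeout=QUIET_WINDOW)
            buffered.append(msg)      # user is still typing; keep buffering
        except queue.Empty:
            if buffered:              # user went quiet; process the whole batch at once
                process_batch(buffered)
                buffered = []

threading.Thread(target=debounce_loop, daemon=True).start()
incoming.put("first message")                 # example usage
incoming.put("follow-up sent right after")
time.sleep(QUIET_WINDOW + 2)                  # keep the sketch alive long enough to see the flush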
@DevvStrange, you were brainstorming on this topic, weren't you? Any update?
@Lekhanrao Yes, we analysed the solution Varun shared. The takeaway was that using time as a delimiter would not be a good user experience, since every request would wait for a certain time, say 10 seconds, before processing starts. That 10-second window might also not be ideal from the user's perspective for sending a follow-up message. So we should brainstorm further for a better solution.
Hi @DevvStrange,
This issue is due to the fact that the subprocess.run
call in bot_input.py
only returns after fsm_runner finishes execution. You can explore a pipe-based approach in the subprocess call, which streams the response in real time. This will handle the issue.
Something like this:
import subprocess

# `command` is the fsm_runner invocation used by bot_input.py
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)

# Stream the output line by line as the runner produces it
try:
    for line in iter(process.stdout.readline, ''):
        print(line, end='')  # output in real time
finally:
    process.stdout.close()
    process.stderr.close()
    process.wait()
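Building on that snippet, a hedged sketch of how the flow main process could forward each line to the channel as it arrives; run_fsm_streaming and send_to_channel are hypothetical names, not existing JB-Manager functions:

import subprocess

def run_fsm_streaming(command, send_to_channel):
    # stderr is merged into stdout so a single read loop sees everything and the
    # stderr pipe cannot fill up and block the runner while we only read stdout.
    process = subprocess.Popen(
        command,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    try:
        for line in iter(process.stdout.readline, ""):
            send_to_channel(line.rstrip("\n"))   # forward each line in real time
    finally:
        process.stdout.close()
        return_code = process.wait()
    return return_code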
Describe the issue
The FSM doesn't stream output (return it in real time). Output is returned only when the FSM waits for input or after it has exited.
Steps to reproduce
No response
Screenshots and logs
No response
Additional Information
No response