chrisrude / oobabot

A Discord bot which talks to Large Language Model AIs running on oobabooga's text-generation-webui
MIT License

splitting responses + references == messiness #38

Closed chrisrude closed 1 year ago

chrisrude commented 1 year ago

@chrisrude something I've noticed with this is that with split responses enabled, every single split part replies to the same message, which can look a little messy. Off the top of my head there are two obvious solutions: either have only the first split part reply to the request message (which I guess would sometimes make it unclear where the response ends), or keep the whole response in a single message (which is very humanlike but could be slow). I think the second solution could be more elegant, and it could also be used to bring back the deprecated response-streaming feature: instead of dumping the entire response at once, the bot would split it up as usual, but instead of sending the parts as separate messages, it would edit the original message split by split. My impression is that this would call the Discord API far fewer times than the original text-streaming implementation and work much more smoothly.

Originally posted by @rebek43 in https://github.com/chrisrude/oobabot/issues/31#issuecomment-1564488716
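For illustration, here is a minimal sketch of what that edit-based streaming could look like with discord.py. The async generator `generate_chunks()` is a hypothetical stand-in for whatever yields the response one split at a time; this is not oobabot's actual code.

```python
# Sketch only: assumes discord.py and a hypothetical async generator
# `generate_chunks()` that yields one split of the response at a time.
import discord


async def stream_via_edits(request: discord.Message, generate_chunks) -> None:
    sent: discord.Message | None = None
    text = ""
    async for chunk in generate_chunks():
        text = f"{text} {chunk}".strip()
        if sent is None:
            # The first split is posted as a reply to the request message.
            sent = await request.reply(text)
        else:
            # Later splits edit that same message, so the channel stays tidy
            # and the Discord API is hit once per split rather than per token.
            await sent.edit(content=text)
```

A real implementation would also have to respect Discord's 2000-character message limit and start a fresh message once the edited one grows too long.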

chrisrude commented 1 year ago

Playing around with this, it seems that having the bot reply-mention only the very first message in a chain works best. The reason is that requests to oobabooga should be serialized anyway, so we will always finish the first reply in its entirety before starting on an unsolicited one.
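A rough sketch of that behavior with discord.py, where the `parts` list stands in for the already-split response (names are illustrative, not oobabot's):

```python
# Sketch only: reply-mention just the first split, send the rest as
# ordinary messages in the same channel.
import discord


async def send_split_response(request: discord.Message, parts: list[str]) -> None:
    for i, part in enumerate(parts):
        if i == 0:
            # Only the first split replies to the message that triggered it,
            # marking where this (serialized) response begins.
            await request.reply(part)
        else:
            await request.channel.send(part)
```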

Ironically, this is how things worked in the very first implementation. :)