sobelio / llm-chain

`llm-chain` is a powerful Rust crate for building chains in large language models, allowing you to summarise text and complete complex tasks.
https://llm-chain.xyz
MIT License

Chain conversation: ability to send a message and receive a stream feature request #231

Open JohnTheCoolingFan opened 10 months ago

JohnTheCoolingFan commented 10 months ago

Currently Chain::send_message forces the output to be of Immediate type, which is not suitable for my use.

Perhaps add a method that doesn't consume the stream but instead returns a different stream, one which also collects the streaming output and appends it to the history? The async-stream crate might be helpful for that.

JohnTheCoolingFan commented 10 months ago

After trying to implement this myself in a way that would be nice, I came to the conclusion that this is not possible with the current design of the API. I'll reuse the code of Chain in my project but will adjust it the way I need it to work.

Also, why make Output the return type of send_message if it always returns Output::Immediate? Just return the underlying type, instead of being ambiguous about the fact that the stream is always consumed and turned into an immediate response.
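To illustrate the API concern in isolation (this is a toy model, not `llm-chain`'s real `Output` type or `send_message` signature): if a method's return type is an enum but only one variant is ever produced, every caller must still match on the variant that can never occur.

```rust
// Toy stand-ins for the types the comment describes.
#[derive(Debug, PartialEq)]
enum Output {
    Immediate(String),
    // A streaming variant exists in the type, but the method below never
    // produces it, so the return type over-promises.
    #[allow(dead_code)]
    Stream(Vec<String>),
}

// Current shape: the stream is consumed internally and wrapped in Immediate.
fn send_message_wrapped(reply: &str) -> Output {
    Output::Immediate(reply.to_string())
}

// Suggested shape: return the underlying type directly, no dead match arm.
fn send_message_direct(reply: &str) -> String {
    reply.to_string()
}

fn main() {
    // Callers of the wrapped version must handle a case that cannot happen.
    match send_message_wrapped("hi") {
        Output::Immediate(s) => assert_eq!(s, "hi"),
        Output::Stream(_) => unreachable!("never produced by send_message_wrapped"),
    }
    // The direct version makes the always-immediate behaviour explicit.
    assert_eq!(send_message_direct("hi"), "hi");
}
```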

LUK3ARK commented 9 months ago

Hey @JohnTheCoolingFan did you have any luck with this?