Open prabirshrestha opened 4 months ago
Currently I need to write a lot of boilerplate code to detect when the stream has ended. This also means I need to add async_openai as a dependency.
```rust
let mut stream = chain.stream(input_variables).await?;
while let Some(result) = stream.next().await {
    match &result {
        Ok(data) => data.to_stdout()?,
        Err(ChainError::LLMError(LLMError::OpenAIError(OpenAIError::StreamError(e)))) => {
            if e == "Stream ended" {
                break;
            } else {
                error!("OpenAI Error: {:?}", e);
                break;
            }
        }
        Err(e) => panic!("Error: {:?}", e),
    }
}
```
This could be simplified to use `Result<Option<StreamData>>`:
```rust
let mut stream = chain.stream(input_variables).await?;
while let Some(result) = stream.next().await {
    match &result {
        Ok(Some(data)) => data.to_stdout()?,
        Ok(None) => break, // stream ended cleanly
        Err(e) => panic!("Error: {:?}", e),
    }
}
```
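The change could be implemented as a thin adapter over each stream item that folds the `"Stream ended"` sentinel error into `Ok(None)`, so callers never need to depend on async_openai just to detect end-of-stream. A minimal synchronous sketch of that mapping (the `StreamData` struct and `ChainError` enum below are simplified stand-ins for the real langchain-rust types, not the actual definitions):

```rust
// Hypothetical, simplified stand-ins for the real langchain-rust types.
#[derive(Debug, PartialEq)]
struct StreamData(String);

#[derive(Debug, PartialEq)]
enum ChainError {
    // Stands in for ChainError::LLMError(LLMError::OpenAIError(StreamError(..))).
    StreamError(String),
    Other(String),
}

/// Fold the "Stream ended" sentinel error into Ok(None), so the caller's
/// match only has three self-describing arms: Ok(Some(_)), Ok(None), Err(_).
fn normalize(item: Result<StreamData, ChainError>) -> Result<Option<StreamData>, ChainError> {
    match item {
        Ok(data) => Ok(Some(data)),
        Err(ChainError::StreamError(ref e)) if e == "Stream ended" => Ok(None),
        Err(e) => Err(e),
    }
}

fn main() {
    // Data passes through wrapped in Some.
    assert_eq!(
        normalize(Ok(StreamData("hi".into()))),
        Ok(Some(StreamData("hi".into())))
    );
    // The end-of-stream sentinel becomes a clean Ok(None).
    assert_eq!(
        normalize(Err(ChainError::StreamError("Stream ended".into()))),
        Ok(None)
    );
    // Real errors still surface as Err.
    assert!(normalize(Err(ChainError::Other("boom".into()))).is_err());
    println!("ok");
}
```

Applying this inside the library's stream would let the simplified consumer loop above work unchanged.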