Closed — mumengmeng closed this issue 1 year ago
Streaming is supported; see StreamOpenAIExample:
@RunnableExample
public class StreamOpenAIExample {

    public static void main(String[] args) {
        // Build and initialize the OpenAI LLM client
        var llm = OpenAI.builder()
                .maxTokens(1000)
                .temperature(0)
                .requestTimeout(120)
                .build()
                .init();

        // asyncPredict returns a reactive stream; print each chunk as it arrives
        var result = llm.asyncPredict("Introduce West Lake in Hangzhou, China.");
        result.doOnNext(System.out::print).blockLast();
    }
}
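If you prefer not to block the calling thread with blockLast(), you can subscribe to the stream instead. A minimal sketch, assuming asyncPredict returns a Reactor Flux<String> (the exact return type and the OpenAI builder options are taken from the example above; verify them against your version of the library):

import java.util.concurrent.CountDownLatch;

public class StreamSubscribeExample {

    public static void main(String[] args) throws InterruptedException {
        // Same client setup as in StreamOpenAIExample
        var llm = OpenAI.builder()
                .maxTokens(1000)
                .temperature(0)
                .requestTimeout(120)
                .build()
                .init();

        // Consume the token stream asynchronously instead of blocking on blockLast()
        var latch = new CountDownLatch(1);
        llm.asyncPredict("Introduce West Lake in Hangzhou, China.")
                .subscribe(
                        System.out::print,          // print each chunk as it arrives
                        Throwable::printStackTrace, // surface any stream error
                        latch::countDown);          // signal completion
        latch.await();                              // keep the JVM alive until the stream finishes
    }
}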
Awesome!
Can it stream the output? Waiting for the full response to complete takes too long.