HamaWhiteGG / langchain-java

Java version of LangChain, while empowering LLM for Big Data.
Apache License 2.0

supported stream response? #60

Closed. kael-aiur closed this issue 12 months ago.

kael-aiur commented 1 year ago

I am using the RetrievalQA chain to build a document-based conversational tool, but every time I ask a question about the document's content, I have to wait for the large language model to finish the entire answer. Sometimes that takes a long time, and I cannot tell whether an error has occurred. Could this conversational chain support a streaming response interface?

HamaWhiteGG commented 1 year ago

Can you provide the code? I want to confirm that the delay is not caused by vectorization or by the time-consuming process of inserting text data into the vector database.

kael-aiur commented 1 year ago

Yes, it is not vectorization. What I want is something like the async API in Python. I have defined the interface in my fork, currently without an implementation for any LLM client; I am only using it with my private LLM server, implemented by my private LLM client, as a preview. I can submit a pull request to this project.

HamaWhiteGG commented 1 year ago

Good, looking forward to your pull request.

ChildWangWorld commented 1 year ago

great job

HamaWhiteGG commented 1 year ago

Supported, see StreamOpenAIExample:

```java
@RunnableExample
public class StreamOpenAIExample {

    public static void main(String[] args) {

        var llm = OpenAI.builder()
                .maxTokens(1000)
                .temperature(0)
                .requestTimeout(120)
                .build()
                .init();

        // asyncPredict returns a reactive stream that emits the answer
        // token by token; doOnNext prints each token as it arrives, and
        // blockLast waits until the stream completes.
        var result = llm.asyncPredict("Introduce West Lake in Hangzhou, China.");
        result.doOnNext(System.out::print).blockLast();
    }
}
```
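For readers unfamiliar with the streaming model, here is a minimal, self-contained sketch of the idea behind `asyncPredict`: the answer is delivered chunk by chunk through a callback, so the caller can render partial output immediately instead of blocking until the full completion arrives. The `streamAnswer` helper and `StreamingSketch` class below are hypothetical illustrations, not part of langchain-java.

```java
import java.util.List;
import java.util.function.Consumer;

public class StreamingSketch {

    // Simulated LLM that "generates" its answer in chunks and invokes
    // the callback for each chunk as soon as it is produced.
    static void streamAnswer(List<String> chunks, Consumer<String> onToken) {
        for (String chunk : chunks) {
            onToken.accept(chunk); // caller sees partial output immediately
        }
    }

    public static void main(String[] args) {
        List<String> chunks = List.of("West ", "Lake ", "is ", "in ", "Hangzhou.");
        StringBuilder full = new StringBuilder();

        // Analogous to result.doOnNext(System.out::print) in the example above:
        // print each token as it arrives while also accumulating the full answer.
        streamAnswer(chunks, token -> {
            System.out.print(token);
            full.append(token);
        });

        System.out.println();
        System.out.println("length=" + full.length());
    }
}
```

A blocking `predict` call would only return `full` at the end; the streaming variant surfaces every intermediate chunk, which is exactly what lets a chat UI show the answer as it is generated.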