buhe / langchain-swift

🚀 LangChain for Swift. Optimized for iOS, macOS, watchOS (in part), and visionOS (beta).
https://buhe.dev
Apache License 2.0
323 stars · 35 forks

Chat Stream #39

Closed Tobaibrahim closed 1 year ago

Tobaibrahim commented 1 year ago

It's unclear how to get the stream output from a chat. I assumed you could do something like this using AsyncSequence:

    let output = await chatgpt_chain.predict(args: input)
    for try await line in output {
        print(line)
    }

Please show an example of how to use it with chats. Thanks

buhe commented 1 year ago

Good question. You need to use the ChatOpenAI model:

        // Set up an event loop group and HTTP client for the OpenAI request.
        let eventLoopGroup = MultiThreadedEventLoopGroup(numberOfThreads: 1)
        let httpClient = HTTPClient(eventLoopGroupProvider: .shared(eventLoopGroup))
        defer {
            // It's important to shut down the httpClient after all requests are done, even if one failed. See: https://github.com/swift-server/async-http-client
            try? httpClient.syncShutdown()
        }

        let llm = ChatOpenAI(httpClient: httpClient, temperature: 0.8)
        let answer = await llm.send(text: "Hey")

        // Accumulate the streamed deltas as they arrive.
        var writerText = ""
        for try await c in answer.generation! {
            if let message = c.choices.first?.delta.content {
                writerText += message
            }
        }
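
Note that `generation` is optional and the snippet above force unwraps it. If you prefer not to crash when no stream comes back, a guarded variant of the same loop works too, for example:

        // Safer variant of the loop above: skip streaming if `generation` is nil.
        if let stream = answer.generation {
            for try await c in stream {
                if let message = c.choices.first?.delta.content {
                    writerText += message
                }
            }
        }
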
buhe commented 1 year ago

If you have any good advice about APIs, don't hesitate to mention them.

Tobaibrahim commented 1 year ago

Thanks for the quick response. It's all working now! If anyone in the future needs a code sample, I can provide that. ❤️ You also have to import AsyncHTTPClient and NIO to get this working.
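
For future readers, here is a rough, self-contained sketch that combines the pieces from this thread into one function. The `import LangChain` line and the function name are assumptions (adjust them to your package setup); the ChatOpenAI calls are the ones shown above, and OpenAI key configuration is omitted.

    import AsyncHTTPClient
    import NIO
    import LangChain // assumed module name for buhe/langchain-swift

    // Streams a chat reply and returns the accumulated text.
    // API key configuration is omitted; set it up as described in the project README.
    func streamChatExample() async throws -> String {
        let eventLoopGroup = MultiThreadedEventLoopGroup(numberOfThreads: 1)
        let httpClient = HTTPClient(eventLoopGroupProvider: .shared(eventLoopGroup))
        defer {
            // Shut the client down when the request is finished, even if it failed.
            try? httpClient.syncShutdown()
        }

        let llm = ChatOpenAI(httpClient: httpClient, temperature: 0.8)
        let answer = await llm.send(text: "Hey")

        var writerText = ""
        if let stream = answer.generation {
            for try await c in stream {
                if let message = c.choices.first?.delta.content {
                    writerText += message
                    print(message, terminator: "") // show each chunk as it arrives
                }
            }
        }
        return writerText
    }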

buhe commented 1 year ago

NP, feel free to open this issue again ;)