Lambdua / openai4j

Java client library for the OpenAI API. Full support for all OpenAI API models, including Completions, Chat, Edits, Embeddings, Audio, Files, Assistants-v2, Images, Moderations, Batch, and Fine-tuning.
MIT License
331 stars · 31 forks

Add ChatMessageAccumulatorWrapper so that the raw chunk data can be used with chatCompletion… #70

Closed: big-mouth-cn closed this 1 month ago

big-mouth-cn commented 1 month ago

Added ChatMessageAccumulatorWrapper so that the raw chunk data can be accessed when using chatCompletion. This is mainly intended for scenarios where chat API responses are forwarded (proxied) to another consumer.

```java
Flowable<ChatCompletionChunk> streamedChatCompletion = openAiService.streamChatCompletion(requestBody);

AssistantMessage accumulatedMessage = openAiService.mapStreamToAccumulatorWrapper(streamedChatCompletion)
        .doOnNext(chatMessageAccumulatorWrapper -> {
            ChatMessageAccumulator chatMessageAccumulator = chatMessageAccumulatorWrapper.getChatMessageAccumulator();
            if (!chatMessageAccumulator.isFunctionCall()) {
                ChatCompletionChunk chatCompletionChunk = chatMessageAccumulatorWrapper.getChatCompletionChunk();
                // Raw JSON of the chunk, exactly as received from the API, for pass-through forwarding.
                String source = chatCompletionChunk.getSource();
                if (source != null) {
                    byte[] bytes = source.getBytes(StandardCharsets.UTF_8);
                    sseEmitter.send(SseEmitter.event().data(bytes, MediaType.APPLICATION_JSON_UTF8));
                }
            }
        })
        .doOnError(throwable -> {
            sseEmitter.completeWithError(throwable);
            log.error("streamChatCompletion error", throwable);
        })
        .lastElement()
        .blockingGet()
        .getChatMessageAccumulator()
        .getAccumulatedMessage();
```
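For context, the idea behind the wrapper is to pair each running accumulated message with the raw chunk that produced it, so the chunk can be forwarded unchanged while the caller still ends up with the full message. A minimal self-contained sketch of that pairing, using hypothetical stand-in types (`Chunk`, `Wrapper`) rather than the library's actual classes:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for a raw streamed chunk: the original JSON plus its content delta (hypothetical).
record Chunk(String rawJson, String deltaContent) {}

// Pairs the accumulated text so far with the raw chunk that produced it,
// mirroring the getChatMessageAccumulator()/getChatCompletionChunk() pair.
record Wrapper(String accumulated, Chunk chunk) {}

public class AccumulatorWrapperSketch {

    // Accumulate deltas while keeping every raw chunk available for forwarding.
    static List<Wrapper> accumulate(List<Chunk> chunks) {
        List<Wrapper> out = new ArrayList<>();
        StringBuilder acc = new StringBuilder();
        for (Chunk c : chunks) {
            acc.append(c.deltaContent());
            out.add(new Wrapper(acc.toString(), c)); // raw chunk preserved alongside the running total
        }
        return out;
    }

    public static void main(String[] args) {
        List<Chunk> chunks = List.of(
                new Chunk("{\"delta\":\"Hel\"}", "Hel"),
                new Chunk("{\"delta\":\"lo\"}", "lo"));
        List<Wrapper> wrapped = accumulate(chunks);
        // Each raw chunk can be forwarded as-is; the last wrapper holds the full message.
        System.out.println(wrapped.get(wrapped.size() - 1).accumulated()); // prints "Hello"
        System.out.println(wrapped.get(0).chunk().rawJson());
    }
}
```

This is only a sketch of the design: the real `ChatMessageAccumulatorWrapper` carries the library's `ChatMessageAccumulator` and `ChatCompletionChunk` types, but the forwarding pattern (emit the raw chunk, keep accumulating, take the last element for the final message) is the same.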