markpollack opened 1 week ago
For the chat models and the embedding models that are in `OpenAiApi`, add logging at the debug level.
This should also be done for streaming endpoints, using the technique in `PromptChatMemoryAdvisor`, e.g.
```java
@Override
public Flux<ChatResponse> adviseResponse(Flux<ChatResponse> fluxChatResponse, Map<String, Object> context) {
	return new MessageAggregator().aggregate(fluxChatResponse, chatResponse -> {
		List<Message> assistantMessages = chatResponse.getResults()
			.stream()
			.map(g -> (Message) g.getOutput())
			.toList();
		this.getChatMemoryStore().add(this.doGetConversationId(context), assistantMessages);
	});
}
```
I shared some thoughts about this in https://github.com/spring-projects/spring-ai/issues/512#issuecomment-2185096414
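For illustration, here is a minimal, framework-free sketch of the "aggregate, then log once" idea behind `MessageAggregator`: the chunks of a streamed response are collected, and a single debug-level log entry is emitted for the complete response instead of one entry per chunk. This uses only the JDK (`java.util.stream` instead of Reactor's `Flux`, and `java.util.logging` instead of SLF4J); the class and method names are hypothetical, not part of Spring AI.

```java
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Hypothetical sketch: aggregate the chunks of a streamed response and
// log the complete result once at debug level, analogous to what
// MessageAggregator enables for a Flux<ChatResponse>.
public class StreamingDebugLogger {

	private static final Logger logger = Logger.getLogger(StreamingDebugLogger.class.getName());

	// Collects the streamed chunks into one string, logs it at debug
	// level (FINE in java.util.logging), and returns it to the caller.
	public static String aggregateAndLog(Stream<String> chunks) {
		String aggregated = chunks.collect(Collectors.joining());
		if (logger.isLoggable(Level.FINE)) {
			logger.fine("Aggregated chat response: " + aggregated);
		}
		return aggregated;
	}

	public static void main(String[] args) {
		System.out.println(aggregateAndLog(Stream.of("Hello", ", ", "world")));
	}
}
```

The guard on `isLoggable` mirrors the usual practice of avoiding string concatenation when debug logging is disabled, which matters for potentially large model responses.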