Open SimonPrammer opened 1 year ago
This would probably work best with LangChain callbacks and the standard .stream() method, as in the source-returning example. I think this would be a helpful feature generally: it would remove the need to use headers and would greatly simplify frontend parsing with the SDK. Currently, we have to handle the response directly and extract the headers in an onResponse callback:
https://github.com/langchain-ai/langchain-nextjs-template/blob/main/components/ChatWindow.tsx#L43
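To illustrate the workaround being described, here is a minimal sketch of pulling source metadata out of a custom response header, the way a handler passed as useChat({ onResponse }) would. The header name "x-sources" and the base64-encoded JSON payload are assumptions for illustration, not part of either SDK.

```typescript
// Sketch of the current header-based workaround: reading source
// metadata out of a custom response header in an onResponse callback.
// The header name "x-sources" and base64 JSON encoding are assumptions.

interface Source {
  url: string;
  title: string;
}

// Stand-in for the handler you would pass as useChat({ onResponse }).
function extractSources(response: Response): Source[] {
  const header = response.headers.get("x-sources");
  if (!header) return [];
  // HTTP headers must be ASCII, so structured data is typically
  // base64-encoded JSON on the server and decoded here.
  const json = Buffer.from(header, "base64").toString("utf8");
  return JSON.parse(json) as Source[];
}

// Example: a response carrying one source in its headers.
const payload = Buffer.from(
  JSON.stringify([{ url: "https://example.com", title: "Example" }])
).toString("base64");
const res = new Response("streamed tokens...", {
  headers: { "x-sources": payload },
});
console.log(extractSources(res));
```

This is exactly the kind of per-app glue that a first-class StreamData integration would make unnecessary.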
Feature Description
It would be absolutely amazing if the Vercel AI SDK's new experimental StreamData supported ingesting data from LangChain's streamLog function.
streamLog emits log data for the entire run and is especially useful for extracting metadata from a retriever.
Chat LangChain uses it to display citations for its sources: https://chat.langchain.com/
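As a sketch of what such an integration could do internally: streamLog emits JSON-patch-style chunks that get folded into a run state, from which retriever metadata can be extracted and forwarded as StreamData. The patch shapes below mirror LangChain's RunLogPatch ops, but the run name "Retriever", the paths, and the document fields are illustrative assumptions, not the SDK's actual wire format.

```typescript
// Sketch: fold streamLog-style JSON-patch chunks into a run state and
// pull the retriever's document metadata out once it is available.
// Run name "Retriever" and the document shape are assumptions.

type JsonPatchOp = { op: "add" | "replace"; path: string; value: unknown };

interface RetrievedDoc {
  pageContent: string;
  metadata: { source: string };
}

// Apply one patch op at a "/a/b/c" path inside a plain object state.
function applyOp(state: Record<string, any>, op: JsonPatchOp): void {
  const keys = op.path.split("/").filter(Boolean);
  let node = state;
  for (const key of keys.slice(0, -1)) {
    node[key] ??= {};
    node = node[key];
  }
  node[keys[keys.length - 1]] = op.value;
}

// Collect the retriever's final documents once the log contains them.
function extractDocs(state: Record<string, any>): RetrievedDoc[] {
  return state.logs?.Retriever?.final_output?.documents ?? [];
}

// Simulated streamLog chunks: the retriever run appears, then finishes.
const state: Record<string, any> = {};
const chunks: JsonPatchOp[][] = [
  [{ op: "add", path: "/logs/Retriever", value: { final_output: null } }],
  [
    {
      op: "add",
      path: "/logs/Retriever/final_output",
      value: {
        documents: [
          {
            pageContent: "...",
            metadata: { source: "https://example.com/doc" },
          },
        ],
      },
    },
  ],
];
for (const ops of chunks) for (const op of ops) applyOp(state, op);
console.log(extractDocs(state).map((d) => d.metadata.source));
```

With something like this wired up, the extracted sources could be appended to a StreamData instance alongside the token stream instead of being smuggled through headers.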
PS: I would love to test StreamData, but it's not available in the SvelteKit implementation of useChat() yet :/
Use Case
This would enable us to extract metadata from a LangChain run and display it to the user as citations, so they can verify the LLM's sources.
Additional context
No response