langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

response_metadata not present on messages when using RemoteRunnable #7195

Open andyvauliln opened 1 week ago

andyvauliln commented 1 week ago


Example Code

import { RemoteRunnable } from "@langchain/core/runnables/remote";

const url = "http://127.0.0.1:8087/agents/product_agent/chat";
const remoteChain = new RemoteRunnable({ url });

const messageValue = "Suggest some products"; // placeholder user input

const stream = await remoteChain.stream(
  {
    messages: [
      {
        content: messageValue,
        type: "human",
      },
    ],
  },
  {
    configurable: {
      user_id: "andrei",
      agent_id: "product_agent",
      thread_id: "8",
    },
  }
);

for await (const chunk of stream) {
  console.log("Raw chunk:", chunk);
}

"kwargs": {
  "messages": [
    {
      "lc": 1,
      "type": "constructor",
      "id": ["langchain_core", "messages", "AIMessage"],
      "kwargs": {
        "content": "Here is Suggested Products",
        "tool_calls": [],
        "invalid_tool_calls": [],
        "additional_kwargs": {},
        "response_metadata": {} // empty, but LangGraph returns a response with this field populated
      }
    }
  ]
}
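To narrow down where the metadata disappears, the raw payload can be probed directly. This is a sketch: the `SerializedMessage` type and `hasResponseMetadata` helper below are hypothetical, with field names mirroring only the serialized shape shown above, not any documented RemoteRunnable schema.

```typescript
// Hypothetical type mirroring the serialized payload shown above.
// The real LangChain serialization format may contain additional fields.
interface SerializedMessage {
  lc: number;
  type: string;
  id: string[];
  kwargs: {
    content: string;
    response_metadata?: Record<string, unknown>;
    [key: string]: unknown;
  };
}

// Returns true only when the wire payload actually carried metadata,
// so an empty object on the client points at serialization rather
// than at the server never setting it.
function hasResponseMetadata(msg: SerializedMessage): boolean {
  const meta = msg.kwargs.response_metadata;
  return meta !== undefined && Object.keys(meta).length > 0;
}

const example: SerializedMessage = {
  lc: 1,
  type: "constructor",
  id: ["langchain_core", "messages", "AIMessage"],
  kwargs: { content: "Here is Suggested Products", response_metadata: {} },
};

console.log(hasResponseMetadata(example)); // false: metadata arrived empty
```

Logging this per chunk distinguishes "the server sent `{}`" from "the client dropped the field", which is the first thing a maintainer will want to know.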

Error Message and Stack Trace (if applicable)

No response

Description

The problem is that when I set response_metadata on a message in LangGraph, it is not present on the message object received by the client; it arrives as an empty object (response_metadata: {}).

System Info

version: 0.3.5
type: module
engines: node >= 18
platform: macOS

dosubot[bot] commented 1 week ago

Hey there, @andyvauliln! I'm here to help you with LangChain.js. I can assist you with bugs, questions, and becoming a contributor. Let's tackle this together while we wait for a human maintainer.

The issue you're experiencing with response_metadata being empty in the AIMessageChunk objects when using RemoteRunnable might be due to how the stream method processes and returns message data. In the examples provided, response_metadata is included in the structure but appears as an empty object ({}) for each chunk. This suggests that while response_metadata is part of the message structure, it might not be populated with data in certain contexts or configurations [1].

To address this, you might want to check the configuration and ensure that the remote agent is correctly set up to populate response_metadata. If the remote service (e.g., LangGraph) is supposed to provide this metadata, ensure that it is configured to do so and that the data is being correctly transmitted and received by the RemoteRunnable.
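When streaming, metadata can also be spread across chunks rather than missing entirely. A small client-side helper can make that visible; this is a sketch, and the minimal chunk shape it accepts is assumed from the serialized output in this issue, not taken from a documented API.

```typescript
// Merge response_metadata collected across streamed chunks.
// Later chunks overwrite earlier keys, matching a simple
// last-write-wins assumption about how partial metadata arrives.
function mergeMetadata(
  chunks: Array<{ response_metadata?: Record<string, unknown> }>
): Record<string, unknown> {
  return chunks.reduce<Record<string, unknown>>(
    (acc, c) => ({ ...acc, ...(c.response_metadata ?? {}) }),
    {}
  );
}
```

If the merged result is still empty after the stream completes, the metadata never crossed the wire, which would point at the serialization layer between LangGraph and RemoteRunnable rather than at client-side chunk handling.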

If the issue persists, it could be a bug or limitation in the current version of LangChain.js. Since you've already confirmed that updating to the latest version doesn't resolve the issue, consider checking for any updates or patches that might address this specific problem.

To continue talking to Dosu, mention @dosu.