langchain-ai / langgraphjs

⚡ Build language agents as graphs ⚡
https://langchain-ai.github.io/langgraphjs/
MIT License

graph.addEdge method fails when adding an edge to END #184

Closed jamesfdavis closed 1 month ago

jamesfdavis commented 1 month ago

When attempting to add an edge from a node to END in a MessageGraph, the graph.addEdge method fails. The method signature seems to indicate that the endKey parameter should accept typeof END, but the call is rejected with a type error instead.

Steps to Reproduce

  1. Import the necessary modules.
  2. Create a new MessageGraph.
  3. Add a node.
  4. Attempt to add an edge to END.

Expected Behavior

The edge should be added without errors.

Actual Behavior

The call fails with a type error indicating a mismatch:

Argument of type '"oracle"' is not assignable to parameter of type '"__start__" | "__start__"[]'.

Versions

Code

import { ChatOpenAI } from "@langchain/openai";
import { BaseMessage } from "@langchain/core/messages";
import { END, MessageGraph } from "@langchain/langgraph";

const model = new ChatOpenAI({ temperature: 0 });

const graph = new MessageGraph();

graph.addNode("oracle", async (state: BaseMessage[]) => {
  return model.invoke(state);
});

try {
  graph.addEdge("oracle", END);  // This line fails
} catch (error) {
  console.error(error);
}

graph.setEntryPoint("oracle");
const runnable = graph.compile();

Operating System

ProductName: macOS ProductVersion: 14.5 BuildVersion: 23F79

Runtime

NodeJS: v20.13.1

Final Thoughts

After testing the addEdge method in other projects, the error is consistent and does not appear to be tied only to the END constant; the same error also occurs with the StateGraph implementation.
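For reference, here is a minimal StateGraph sketch of the same split pattern (declare the graph, then add the node, then add the edge). The state shape, channels, and node body are assumptions for illustration rather than taken from the report above; the final addEdge is where the equivalent type error shows up.

import { BaseMessage } from "@langchain/core/messages";
import { END, StateGraph } from "@langchain/langgraph";

// Assumed state shape and channels; the original report did not include them.
interface AgentState {
  messages: BaseMessage[];
}

const stateGraph = new StateGraph<AgentState>({
  channels: {
    messages: {
      value: (left: BaseMessage[], right: BaseMessage[]) => left.concat(right),
      default: () => [],
    },
  },
});

stateGraph.addNode("oracle", async (state: AgentState) => {
  // The node body is a placeholder; the typing is the point, not the model call.
  return { messages: state.messages };
});

// With the graph declared separately, its node-name type does not include
// "oracle", so this call is rejected with the same kind of type error.
stateGraph.addEdge("oracle", END);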

hinthornw commented 1 month ago

Does it work if you create it functionally (like I do in a lot of the how-tos)?

const graph = new MessageGraph().addNode("oracle", async (state: BaseMessage[]) => {
  return model.invoke(state);
});
jamesfdavis commented 1 month ago

@hinthornw - After reviewing the how-to examples, I found that I was not chaining the builder.

I adapted my example from the README, so the team might want to update the README to match.

import { ChatOpenAI } from "@langchain/openai";
import { BaseMessage } from "@langchain/core/messages";
import { START, END, MessageGraph } from "@langchain/langgraph";

const model = new ChatOpenAI({ temperature: 0 });

const graph = new MessageGraph()
  .addNode("oracle", async (state: BaseMessage[]) => {
    return model.invoke(state);
  })
  .addEdge("oracle", END)
  .addEdge(START, "oracle");

const runnable = graph.compile();

Thanks for the feedback; it had me digging deeper for an answer.

hinthornw commented 1 month ago

Thanks for sharing! I should fix up the generic typing in the README.
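For context, the chained form works because each addNode call widens the node-name union carried in the graph's generic type, while a graph declared on its own keeps only "__start__". A toy builder (illustrative only, not the actual langgraphjs types) shows the pattern:

// Toy builder, illustrative only: the node-name union N grows with each addNode,
// which mirrors why the chained form type-checks and the split form does not.
class ToyGraph<N extends string = "__start__"> {
  addNode<K extends string>(name: K): ToyGraph<N | K> {
    return this as unknown as ToyGraph<N | K>;
  }
  addEdge(from: N | "__start__", to: N | "__end__"): this {
    return this;
  }
}

// Chained: addNode's return type includes "oracle", so the edge is accepted.
new ToyGraph().addNode("oracle").addEdge("oracle", "__end__");

// Split: g stays typed as ToyGraph<"__start__">, so "oracle" is rejected,
// which matches the error reported above.
const g = new ToyGraph();
g.addNode("oracle");
// g.addEdge("oracle", "__end__"); // Argument of type '"oracle"' is not assignable ...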

gruckion commented 1 month ago

This issue should not be closed as the docs have not been updated.

https://github.com/langchain-ai/langgraphjs

import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, BaseMessage, } from "@langchain/core/messages";
import { END, MessageGraph } from "@langchain/langgraph";

const model = new ChatOpenAI({ temperature: 0 });

const graph = new MessageGraph();

graph.addNode("oracle", async (state: BaseMessage[]) => {
  return model.invoke(state);
});

graph.addEdge("oracle", END);

graph.setEntryPoint("oracle");

const runnable = graph.compile();

Should be updated to

import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, BaseMessage } from "@langchain/core/messages";
import { END, MessageGraph } from "@langchain/langgraph";

const model = new ChatOpenAI({ temperature: 0 });

const graph = new MessageGraph().addNode(
  "oracle",
  async (state: BaseMessage[]) => {
    return model.invoke(state);
  }
);

graph.addEdge("oracle", END);

graph.setEntryPoint("oracle");

const runnable = graph.compile();

Or

import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, BaseMessage } from "@langchain/core/messages";
import { END, MessageGraph, START } from "@langchain/langgraph";

require("dotenv").config();

const model = new ChatOpenAI({ temperature: 0 });

const graph = new MessageGraph().addNode(
  "oracle",
  async (state: BaseMessage[]) => {
    return model.invoke(state);
  }
);

graph.addEdge("oracle", END);

graph.addEdge(START, "oracle");

const runnable = graph.compile();
jamesfdavis commented 1 month ago

Re-opening the issue until the docs are updated, per @gruckion.

hinthornw commented 1 month ago

Alright, all, I went through all the tutorials in the docs, all the how-tos, and the README and rewrote them; everything compiled with tsc and ran as TypeScript with no complaints. Hopefully this resolves the issue. Will keep it open another day or two to let others push back, but then will close.

Thanks again to everyone on this thread (and others) for your patience and for calling out the breaks. Going through this process definitely highlighted how much of a pain in the rear it is to go from the Deno docs to TS.

afuentes commented 1 month ago

I am using this approach as a workaround to suppress the message below.

Argument of type '"oracle"' is not assignable to parameter of type '"start" | "start"[]'.

const graph = new MessageGraph()

....

graph.addEdge(START, "oracle" as any)
graph.addEdge("oracle" as any, END)

return graph
hinthornw commented 1 month ago

@afuentes have you tried adding the nodes first before the edges?

const graph = new MessageGraph()
    .addNode("oracle", ...);
graph.addEdge(START, "oracle");
graph.addEdge("oracle" as any ,END)
hinthornw commented 1 month ago

Seems like folks are no longer running into this? As always, feel free to re-open if there are still issues here.