langchain-ai / langgraph-studio

Desktop app for prototyping and debugging LangGraph applications locally.
https://studio.langchain.com

Context object in `pydantic` state opens and closes connections multiple times between node execution #66

Closed: zboyles closed this issue 1 month ago

zboyles commented 2 months ago

The findings can be replicated by following the tutorial "How to use a context object in state" and specifying a default value for the context field (see below).

Replication Requirement

The context field needs a default value; otherwise graph execution in LCS fails after the first node, and when the graph is run in a terminal/notebook it fails to run at all.

I also realized I had to add a get method to the state class because ToolNode assumes the state is a TypedDict. That may be a separate issue, but I haven't looked into it yet. In any case, I left the logging in to show when messages is accessed (a sketch of the graph wiring follows the state class below).

import operator
from typing import Annotated, Any, Sequence

from langchain.pydantic_v1 import BaseModel, Field
from langchain_core.messages import BaseMessage
from langgraph.prebuilt import ToolNode

# `Context` is the managed value from the tutorial; `AgentContext` and
# `make_agent_context` (shown below) are custom.

class AgentState(BaseModel):
    messages: Annotated[Sequence[BaseMessage], operator.add]
    # default to `None` to allow execution in both environments
    context: Annotated[AgentContext, Context(make_agent_context)] = None

    # Fix to enable `ToolNode`, which attempts to access `state.messages`
    # via `get` because it assumes the state is a `TypedDict` instead of the
    # far superior pydantic `BaseModel`
    def get(self, key: str, default: Any = None) -> Any:
        """Support dict-like access."""
        print(f"GETTING: {key}")
        return getattr(self, key, default)
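
For completeness, here is a rough sketch of how that state is wired into the graph, loosely following the tutorial's layout (`call_model`, `should_continue`, and the `tools` list are the tutorial's and not reproduced here); it just shows where `ToolNode` ends up calling `state.get("messages")`:

from langgraph.graph import END, START, StateGraph

builder = StateGraph(AgentState)
builder.add_node("agent", call_model)        # LLM-calling node from the tutorial
builder.add_node("tools", ToolNode(tools))   # `tools` includes e.g. the Tavily search tool
builder.add_edge(START, "agent")
builder.add_conditional_edges("agent", should_continue, ["tools", END])
builder.add_edge("tools", "agent")
graph = builder.compile()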

Troubleshooting

Here is some very minimal logging to see what's going on 👀

from contextlib import asynccontextmanager

import httpx
from langchain_core.runnables import RunnableConfig

# `HttpxContext` is a small custom wrapper with a `session` attribute.
@asynccontextmanager
async def make_agent_context(config: RunnableConfig):
    asession = httpx.AsyncClient()
    try:
        print("Yielding Httpx Context")
        yield HttpxContext(session=asession)
    finally:
        try:
            print("Closing Httpx Session")
            await asession.aclose()
        except Exception as e:
            print("Error closing session:")
            print(e)
    print("Finished closing sessions")

Logging

The following occurs merely when the LCS window gains focus:

langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions

Running the tutorial with my logging modification and "What's the current weather in SF?" as input: the agent called Tavily and used the httpx context to GET langchain.com.

langgraph-api-1       | 127.0.0.1:39572 - "GET /ok HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:21821 - "GET /assistants/ASS1STNT-0000-0000-0000-000000000000/graph HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:35905 - "GET /assistants/ASS1STNT-0000-0000-0000-000000000000/schemas HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:18082 - "POST /assistants/search HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:19090 - "POST /threads/search HTTP/1.1" 200
langgraph-api-1       | 127.0.0.1:54794 - "GET /ok HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:57438 - "OPTIONS /threads HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:57438 - "POST /threads HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:57438 - "OPTIONS /threads/00000000-0000-0000-0000-000000000000/runs/stream HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:50031 - "OPTIONS /threads/00000000-0000-0000-0000-000000000000/history HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:18150 - "OPTIONS /threads/00000000-0000-0000-0000-000000000000/history HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:50031 - "POST /threads/00000000-0000-0000-0000-000000000000/history HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:57438 - "POST /threads/00000000-0000-0000-0000-000000000000/history HTTP/1.1" 200
langgraph-api-1       | 192.168.69.1:62128 - "POST /threads/00000000-0000-0000-0000-000000000000/runs/stream HTTP/1.1" 200
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Attempting to use the httpx context functionality
langgraph-api-1       | HTTP Request: GET https://www.langchain.com "HTTP/1.1 200 OK"
langgraph-api-1       | <Response [200 OK]>
langgraph-api-1       | .../python3.12/site-packages/pydantic/v1/main.py:979: RuntimeWarning: __slots__ should not be passed to create_model
langgraph-api-1       |   warnings.warn('__slots__ should not be passed to create_model', RuntimeWarning)
langgraph-api-1       | 192.168.69.1:57438 - "GET /assistants/ASS1STNT-0000-0000-0000-000000000000/schemas HTTP/1.1" 200
langgraph-api-1       | HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
langgraph-api-1       | GETTING: messages
langgraph-api-1       | Attempting to use the httpx context functionality
langgraph-api-1       | HTTP Request: GET https://www.langchain.com "HTTP/1.1 200 OK"
langgraph-api-1       | <Response [200 OK]>
langgraph-api-1       | .../python3.12/site-packages/pydantic/v1/main.py:979: RuntimeWarning: __slots__ should not be passed to create_model
langgraph-api-1       |   warnings.warn('__slots__ should not be passed to create_model', RuntimeWarning)
langgraph-api-1       | 192.168.69.1:29360 - "GET /assistants/ASS1STNT-0000-0000-0000-000000000000/schemas HTTP/1.1" 200
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | 192.168.69.1:56101 - "POST /threads/00000000-0000-0000-0000-000000000000/history HTTP/1.1" 200
langgraph-api-1       | HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | Yielding Httpx Context
langgraph-api-1       | Closing Httpx Session
langgraph-api-1       | Finished closing sessions
langgraph-api-1       | 192.168.69.1:23002 - "POST /threads/00000000-0000-0000-0000-000000000000/history HTTP/1.1" 200
langgraph-api-1       | .../python3.12/site-packages/pydantic/v1/main.py:979: RuntimeWarning: __slots__ should not be passed to create_model
langgraph-api-1       |   warnings.warn('__slots__ should not be passed to create_model', RuntimeWarning)
langgraph-api-1       | 192.168.69.1:23002 - "GET /assistants/ASS1STNT-0000-0000-0000-000000000000/schemas HTTP/1.1" 200
nfcampos commented 2 months ago

Hi, this was due to the context channels being initialized when getting the current state or state history of a thread (which the Studio UI does). That is actually unnecessary, so I've disabled it in this PR: https://github.com/langchain-ai/langgraph/pull/1309
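
For anyone checking locally, the same extra enter/exit pairs could previously be triggered just by reading state, roughly as in this sketch (assuming a graph compiled with a checkpointer and an existing thread id):

config = {"configurable": {"thread_id": "example-thread"}}

# Read-only state access; before the PR above, this also initialized the
# context channel and produced the repeated open/close log lines.
snapshot = graph.get_state(config)
history = list(graph.get_state_history(config))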

dqbd commented 1 month ago

This should already be resolved with the newest API image; feel free to reopen if the issue still persists!