langchain-ai / langgraph

Build resilient language agents as graphs.
https://langchain-ai.github.io/langgraph/
MIT License
5.46k stars 866 forks

How to Pass thread_id to langserve chat playground for langgraph? #1020

Closed Arslan-Mehmood1 closed 2 days ago

Arslan-Mehmood1 commented 1 month ago

Checked other resources

Example Code

from typing import List, Union

from fastapi import FastAPI
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.runnables import RunnableLambda
from langserve import add_routes
from pydantic import BaseModel, Field

app = FastAPI(
    title="Biscuit AI LangChain Server",
    version="1.0",
    description="API server using LangChain's Runnable interfaces",
)

class InputChat(BaseModel):
    """Input for the chat endpoint."""

    messages: List[Union[HumanMessage, AIMessage, SystemMessage]] = Field(
        ...,
        description="The chat messages representing the current conversation.",
    )

    input: str

def output_parsing_for_playground(agent_output):
    # Take the last graph event and pull the first message's content
    # out of whichever node produced it.
    data = agent_output[-1]
    content = next(iter(data.values()))["messages"][0].content

    print("graph output : ", agent_output[-1])
    print("content : ", content)

    return content

# `graph` is the compiled LangGraph (its definition is omitted in this report).
add_routes(
    app,
    (graph | RunnableLambda(output_parsing_for_playground)).with_types(input_type=InputChat, output_type=str),
    playground_type="chat",
)

if __name__ == "__main__":
    import uvicorn

    host = "localhost"
    port = 8181

    uvicorn.run(app, host=host, port=port)
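One hedged sketch of a server-side fix: langserve's `add_routes` accepts a `per_req_config_modifier` callback that can merge values from the incoming request into the run config before the graph executes, so the checkpointer's required `thread_id` is always present. The header name `x-thread-id` and the fallback value below are assumptions for illustration, not langserve conventions; the modifier itself is plain Python:

```python
# Sketch: merge a thread_id taken from the HTTP request into the runnable
# config so the checkpointer's required 'configurable' key is always set.
# Assumption: the client sends an "x-thread-id" header; the fallback value
# "playground-default" is invented for illustration.
def inject_thread_id(config: dict, request) -> dict:
    thread_id = request.headers.get("x-thread-id", "playground-default")
    configurable = {**config.get("configurable", {}), "thread_id": thread_id}
    return {**config, "configurable": configurable}

# Hypothetical wiring, mirroring the add_routes call above:
# add_routes(app, runnable, playground_type="chat",
#            per_req_config_modifier=inject_thread_id)
```

With this in place, every playground request runs against a deterministic thread; a real deployment would likely derive the thread id from a session cookie or an authenticated user instead of a static fallback.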

Error Message and Stack Trace (if applicable)

----------------------------------------------------------------------------------------------------
INFO:     Started server process [96599]
INFO:     Waiting for application startup.

 __          ___      .__   __.   _______      _______. _______ .______     ____    ____  _______
|  |        /   \     |  \ |  |  /  _____|    /       ||   ____||   _  \    \   \  /   / |   ____|
|  |       /  ^  \    |   \|  | |  |  __     |   (----`|  |__   |  |_)  |    \   \/   /  |  |__
|  |      /  /_\  \   |  . `  | |  | |_ |     \   \    |   __|  |      /      \      /   |   __|
|  `----./  _____  \  |  |\   | |  |__| | .----)   |   |  |____ |  |\  \----.  \    /    |  |____
|_______/__/     \__\ |__| \__|  \______| |_______/    |_______|| _| `._____|   \__/     |_______|

LANGSERVE: Playground for chain "/" is live at:
LANGSERVE:  │
LANGSERVE:  └──> /playground/
LANGSERVE:
LANGSERVE: See all available routes at /docs/

LANGSERVE: ⚠️ Using pydantic 2.8.2. OpenAPI docs for invoke, batch, stream, stream_log endpoints will not be generated. API endpoints and playground should work as expected. If you need to see the docs, you can downgrade to pydantic 1. For example, `pip install pydantic==1.10.13`. See https://github.com/tiangolo/fastapi/issues/10360 for details.

INFO:     Application startup complete.
INFO:     Uvicorn running on http://localhost:8181 (Press CTRL+C to quit)
INFO:     127.0.0.1:50628 - "GET /playground/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:50628 - "GET /playground/assets/index-86d4d9c0.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:50632 - "GET /playground/assets/index-434ff580.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:50632 - "GET /c/N4XyA/input_schema HTTP/1.1" 200 OK
INFO:     127.0.0.1:50628 - "GET /c/N4XyA/output_schema HTTP/1.1" 200 OK
INFO:     127.0.0.1:50632 - "GET /playground/favicon.ico HTTP/1.1" 200 OK
INFO:     127.0.0.1:50628 - "POST /stream_log HTTP/1.1" 200 OK
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/sse_starlette/sse.py", line 282, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/sse_starlette/sse.py", line 271, in wrap
    await func()
  File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/sse_starlette/sse.py", line 221, in listen_for_disconnect
    message = await receive()
  File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 553, in receive
    await self.message_event.wait()
  File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 71986c155330

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
  |     raise exc
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
  |     await route.handle(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
  |     await self.app(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
  |     await response(scope, receive, send)
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/sse_starlette/sse.py", line 268, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 680, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/sse_starlette/sse.py", line 271, in wrap
    |     await func()
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/sse_starlette/sse.py", line 251, in stream_response
    |     async for data in self.body_iterator:
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langserve/api_handler.py", line 1214, in _stream_log
    |     async for chunk in self._runnable.astream_log(
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 969, in astream_log
    |     async for item in _astream_log_implementation(  # type: ignore
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/tracers/log_stream.py", line 635, in _astream_log_implementation
    |     await task
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/tracers/log_stream.py", line 589, in consume_astream
    |     async for chunk in runnable.astream(input, config, **kwargs):
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 5161, in astream
    |     async for item in self.bound.astream(
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3199, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3182, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2114, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/tracers/log_stream.py", line 239, in tap_output_aiter
    |     async for chunk in output:
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3152, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4585, in atransform
    |     async for output in self._atransform_stream_with_config(
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2067, in _atransform_stream_with_config
    |     final_input: Optional[Input] = await py_anext(input_for_tracing, None)
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 66, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 101, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1281, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/arslan/.virtualenvs/biscuit_ai_langgraph/lib/python3.10/site-packages/langgraph/pregel/__init__.py", line 1297, in astream
    |     raise ValueError(
    | ValueError: Checkpointer requires one or more of the following 'configurable' keys: ['thread_id', 'thread_ts']
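The final `ValueError` is raised because the graph was compiled with a checkpointer, and a checkpointer can only load and save state when the run config identifies a conversation thread. A minimal pure-Python sketch of that guard (a hypothetical re-implementation for illustration; the real check lives inside langgraph's Pregel streaming code):

```python
# Sketch of the validation that produces the error above (hypothetical
# re-implementation for illustration, not langgraph's actual code).
def validate_checkpointer_config(config: dict) -> None:
    required_any = ["thread_id", "thread_ts"]
    configurable = config.get("configurable", {})
    if not any(key in configurable for key in required_any):
        raise ValueError(
            "Checkpointer requires one or more of the following "
            f"'configurable' keys: {required_any}"
        )

# Passing a thread_id in 'configurable' satisfies the check:
validate_checkpointer_config({"configurable": {"thread_id": "chat-1"}})
```

The playground's `/stream_log` request sends no such config, which is why the error appears only there and not when invoking with an explicit `config` argument.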

Description

.

System Info

.

hinthornw commented 1 month ago

Have you tried passing via the config?

my_runnable = RemoteRunnable(...)

my_runnable.invoke(..., config={"configurable": {"thread_id": ...}})
Arslan-Mehmood1 commented 1 month ago

Have you tried passing via the config?

my_runnable = RemoteRunnable(...)

my_runnable.invoke(..., config={"configurable": {"thread_id": ...}})

Yes, it works that way, but I want to use the langserve chat playground.

damianoneill commented 6 days ago

Is there any update on this ticket showing how to use this with langserve?

hinthornw commented 2 days ago

Have you added the appropriate configuration for it to show in the langserve chat playground?

In general, we recommend using langgraph studio at the moment for a better experience.

We are working to extend langgraph studio support to Windows and Linux users soon. It is better suited for langgraph applications.