Closed: umithyo closed this issue 1 month ago
Hi @umithyo, great question. As you pointed out, thread_ts
isn't exposed or really honored as a user-configurable value right now. Could you share more about what you'd gain by managing it outside the system?
If we were to implement this, it would likely only apply to the first checkpoint; we'd overwrite it in subsequent checkpoints within a trace anyhow.
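A toy illustration of that overwrite behavior, using plain dicts rather than LangGraph internals (the save_checkpoint helper here is hypothetical, just mimicking what the comment above describes):

```python
# Hypothetical sketch (not LangGraph internals): a user-supplied
# thread_ts would only survive until the first checkpoint write,
# because every save stamps the stored checkpoint's id over it.
def save_checkpoint(config: dict, checkpoint: dict) -> dict:
    """Return the config later reads would see for this thread."""
    stored = dict(config.get("configurable", {}))
    stored["thread_ts"] = checkpoint["id"]  # overwritten on every save
    return {"configurable": stored}

config = {"configurable": {"thread_id": "t1", "thread_ts": "my-custom-ts"}}
after_save = save_checkpoint(config, {"id": "ckpt-1"})
# The custom thread_ts is gone; the checkpoint id took its place.
assert after_save["configurable"]["thread_ts"] == "ckpt-1"
```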
Hi @hinthornw
Thank you. I just feel it's being represented incorrectly: StateSnapshot presents it as a datetime value, but that's not what it actually is. Also, the last message is fetched by this field even though it's a UUID. Maybe I haven't gotten my head around it yet.
I have no use case at the moment; it just caught my eye while going through the sqlite.py file.
Hi! @hinthornw Actually, I do have a use case for thread_ts, though it may be slightly off-topic. When I serve a graph built with LangGraph behind a LangServe endpoint, I'd like to use the time-travel feature from the client. For now, however, I don't know the best way to deliver thread_ts from the StateSnapshot to a LangServe client.
FYI, thread_ts is exposed to the client because it is included in configurable.
Hi @jty016, for time travel in a deployed version, we're also working on https://github.com/langchain-ai/langgraph-example , which would add better first-class support for this kind of thing. It's still in alpha right now, but we're hoping to have something ready by EOM.
Thanks for clarifying, @umithyo ! Will sync with nuno on this before I give a full response. Appreciate both of your patience
This is very interesting and exactly the approach I'd like to take. However, I think the langgraph cli and langgraph sdk repos aren't ready or public yet.
@jty016 Hi, I'm having an issue using LangGraph with the LangServe playground.
I'm stuck and don't know how to pass the thread_id required by the checkpointer in LangGraph.
Current code, after trying multiple things:
### LANGSERVE PLAYGROUND
from typing import List, Union

from fastapi import FastAPI
from langserve import add_routes
from langserve.pydantic_v1 import BaseModel, Field
from langchain_core.runnables import RunnableLambda, RunnablePassthrough
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.runnables import ConfigurableField
from langchain_core.runnables import RunnableConfig

app = FastAPI(
    title="Biscuit AI LangChain Server",
    version="1.0",
    description="Api server using LangChain's Runnable interfaces",
)


class InputChat(BaseModel):
    """Input for the chat endpoint."""

    messages: List[Union[HumanMessage, AIMessage, SystemMessage]] = Field(
        ...,
        description="The chat messages representing the current conversation.",
    )
    input: str
    # thread_id: int


def output_parsing_for_playground(agent_output):
    data = agent_output[-1]
    content = next(iter(data.values()))["messages"][0].content
    print("graph output : ", agent_output[-1])
    print("content : ", content)
    return content


# def execute_graph(input, thread_id):
#     input_message = [HumanMessage(content=input)]
#     config = {"configurable": {"thread_id": thread_id}}
#     return graph.stream({"messages": input_message}, config, stream_mode="values")


add_routes(
    app,
    (
        # RunnablePassthrough.assign(thread_id=lambda x: x["thread_id"])
        graph.with_config(RunnableConfig(configurable={"configurable": {"thread_id": thread_id}}))
        # | execute_graph
        | RunnableLambda(output_parsing_for_playground)
    ).with_types(input_type=InputChat, output_type=str),
    config_keys=["configurable"],
    playground_type="default",
)

if __name__ == "__main__":
    import uvicorn

    host = "localhost"
    port = 8181
    uvicorn.run(app, host=host, port=port)
error:
| ValueError: Checkpointer requires one or more of the following 'configurable' keys: ['thread_id', 'thread_ts']
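One likely reading of this error (a guess from the posted code, not an official diagnosis): the with_config call above nests "configurable" twice, so the checkpointer's flat lookup of config["configurable"]["thread_id"] finds nothing. A dict-only sketch, with a hypothetical lookup_thread_id helper standing in for that lookup:

```python
# Sketch of the suspected key-nesting problem. lookup_thread_id is a
# hypothetical stand-in for how a checkpointer reads its config keys.
def lookup_thread_id(config: dict):
    return config.get("configurable", {}).get("thread_id")

nested = {"configurable": {"configurable": {"thread_id": 1}}}  # shape as posted
flat = {"configurable": {"thread_id": 1}}                      # expected shape

assert lookup_thread_id(nested) is None  # key missing -> ValueError above
assert lookup_thread_id(flat) == 1       # flat dict is found
```

If this reading is right, passing configurable={"thread_id": ...} directly (without the extra "configurable" layer) would be the shape to try.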
thread_ts has been renamed to checkpoint_id in 0.2
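Given that rename, a config written for pre-0.2 would need its key updated. A hedged sketch with a hypothetical migrate_config helper (plain dicts, no LangGraph required; key names taken from the comment above):

```python
# Hypothetical helper: rewrite a pre-0.2 config that used thread_ts
# to use the 0.2 key name checkpoint_id.
def migrate_config(config: dict) -> dict:
    configurable = dict(config.get("configurable", {}))
    if "thread_ts" in configurable:
        configurable["checkpoint_id"] = configurable.pop("thread_ts")
    return {"configurable": configurable}

old = {"configurable": {"thread_id": "t1", "thread_ts": "abc"}}
new = migrate_config(old)
assert new["configurable"] == {"thread_id": "t1", "checkpoint_id": "abc"}
```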
@hinthornw any solution for the above-mentioned issue with using LangGraph (with a checkpointer) together with LangServe?
@Arslan-Mehmood1 I can help you
Checked other resources
Example Code
The following code:
Following graph.invoke(), the stream method sets thread_ts to checkpoint["id"] instead of the thread_ts passed in config.
Error Message and Stack Trace (if applicable)
No response
Description
I'm trying to create persistent storage for LangGraph's messages. While trying it out, I noticed that the thread_ts argument I pass gets changed to checkpoint["id"] by stream. Am I missing something, or is this something to be fixed? If so, I'd like to create a PR for it.
System Info
langchain==0.2.1
langchain-community==0.2.1
langchain-core==0.2.1
langchain-mongodb==0.1.5
langchain-openai==0.1.7
langchain-pinecone==0.1.1
langchain-text-splitters==0.2.0