langchain-ai / langgraph-studio

Desktop app for prototyping and debugging LangGraph applications locally.
https://studio.langchain.com

Specify default messages #88

Open · austinmw opened this issue 3 months ago

austinmw commented 3 months ago

How can I prepopulate messages (e.g. a system message) so I don't need to paste them into the Studio UI each time I test my graph?

dqbd commented 3 months ago

Hello @austinmw! Assuming you're using the LangGraph Example template, you can update the call_model node to automatically add the system message, like so:

from langchain_core.messages import SystemMessage

# Define the function that calls the model
def call_model(state, config):
    messages = state["messages"]
    model_name = config.get("configurable", {}).get("model_name", "anthropic")
    model = _get_model(model_name)  # helper defined in the template
    # Prepend the system message at call time (it is not stored in state)
    response = model.invoke([SystemMessage("Your system message"), *messages])
    # We return a list, because this will get added to the existing list
    return {"messages": [response]}
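
For reference, here is a minimal sketch of how that call_model node might be wired into a graph. The state schema and graph names below are assumptions that follow the general shape of the example template, not its exact code:

from typing import Annotated, TypedDict

from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages


class AgentState(TypedDict):
    # add_messages appends returned messages to the existing list
    messages: Annotated[list, add_messages]


workflow = StateGraph(AgentState)
workflow.add_node("call_model", call_model)
workflow.add_edge(START, "call_model")
workflow.add_edge("call_model", END)
graph = workflow.compile()
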
austinmw commented 3 months ago

@dqbd Thanks, that works! As an alternative, is there a way to pre-populate state information when the LangGraph Studio project loads?

The only issue with the above example is that the system message is not retained in the state. Ideally it would be the first message in the state, which could be useful for logging and other purposes.
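
One possible workaround, sketched under the assumption that the graph uses the default add_messages reducer: keep the system prompt in its own state key so it is retained in state (and visible for logging), and have call_model place it first when invoking the model. The system_prompt key, its default value, and the AgentState schema are hypothetical names for illustration; _get_model is the helper from the template. This does not literally make the system message the first entry in messages, but it does persist it in the graph state.

from typing import Annotated, TypedDict

from langchain_core.messages import SystemMessage
from langgraph.graph.message import add_messages


class AgentState(TypedDict):
    messages: Annotated[list, add_messages]
    system_prompt: str


def call_model(state: AgentState, config):
    # Fall back to a default prompt if the state does not provide one
    system_prompt = state.get("system_prompt") or "Your system message"
    model_name = config.get("configurable", {}).get("model_name", "anthropic")
    model = _get_model(model_name)  # helper from the template
    response = model.invoke([SystemMessage(system_prompt), *state["messages"]])
    # Write system_prompt back so it stays in the state across runs
    return {"messages": [response], "system_prompt": system_prompt}

With this shape, the prompt should appear in the graph state in Studio and can be inspected there, though whether Studio should support pre-populating state when a project loads is a separate question for the maintainers.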