langchain-ai / langsmith-sdk

LangSmith Client SDK Implementations
https://smith.langchain.com/
MIT License

If a chain has a ConfigurableField and the chain is invoked through .with_config(...), then the config should be logged in LangSmith #809

Open codekiln opened 1 week ago

codekiln commented 1 week ago

Feature request

If a chain has a ConfigurableField and the chain is invoked through RemoteRunnable.with_config(...), then the config should be logged in LangSmith. As far as I can tell, it is not always automatically logged.

Motivation

We have a prompt management UI that allows us to tune parameters of the prompt, which are then sent to LangServe via RemoteRunnable invocation using .with_config(). One of the goals of LangSmith logging for us is to identify the parameters correlated with a good run. For example, the search_kwargs of our VectorStoreRetriever are not logged automatically anywhere I can see. This happens even when calling retriever.with_config(...).invoke(...) directly:

[screenshot attachment]

(trace id: e2853ba1-835d-4842-ac49-6eabd222a2e1)

[screenshot attachment]

Is the current assumption that we need to add all configurable fields through custom logging? If so, do you have a recommended way to do that?

hinthornw commented 1 week ago

Configurable values should be auto-added as metadata - could you elaborate on this line retriever.with_config(...).invoke() - what are you putting in with_config?

You can also explicitly add metadata in with_config if you'd like.
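For example, something like the following sketch: mirror the same dict into both keys of the config. The "search_kwargs_for_retriever" key name is just an illustration (taken from the reporter's setup), not a required name.

```python
# Illustrative sketch: pass the same dict as both `configurable` and
# `metadata` so the values are guaranteed to show up on the LangSmith run.
# The "search_kwargs_for_retriever" key is an example name, not an API name.
configurable = {"search_kwargs_for_retriever": {"k": 8, "filter": {"source": "docs"}}}

config = {
    "configurable": configurable,                 # consumed by ConfigurableField
    "metadata": {"configurable": configurable},   # explicitly logged on the trace
}

# Then apply it to any Runnable (including a RemoteRunnable):
# chain.with_config(**config).invoke(...)
```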

codekiln commented 1 week ago

what are you putting in with_config?

@hinthornw For example, we call our chains via a LangServe endpoint using RemoteRunnable.with_config(configurable={"prompt_for_agent_a": ..., "prompt_for_agent_b": ..., "search_kwargs_for_retriever": { complex dictionary including k, similarity score cutoff, filters, etc }, ... }). We're storing the configurable dictionary in stateful persistence so that we can compare these hyperparameters across different experiments.

You can also explicitly add metadata in with_config if you'd like

For now maybe I'll try adding the entire dictionary sent to configurable to metadata as well, but it seems like that isn't very DRY. I imagine we're not the only ones who need to see the "runtime" configuration of a given chain in the traces. Also, based on my experimentation, it seems like LangServe doesn't pass through all of the metadata: https://github.com/langchain-ai/langserve/issues/694
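A small helper along these lines might keep that workaround DRY (a sketch only; the function name and key layout are my own, not LangChain API):

```python
def config_with_mirrored_metadata(configurable: dict, **extra_metadata) -> dict:
    """Build kwargs for .with_config() that mirror the `configurable` dict
    into `metadata`, so runtime parameters appear on LangSmith traces
    without repeating the dict at every call site.

    Hypothetical helper, not part of LangChain or LangSmith.
    """
    return {
        "configurable": configurable,
        # Copy so later mutation of `configurable` doesn't silently change
        # what was recorded as metadata.
        "metadata": {"configurable": dict(configurable), **extra_metadata},
    }

# Usage with a RemoteRunnable (or any Runnable):
# chain = remote_runnable.with_config(
#     **config_with_mirrored_metadata(cfg, experiment="exp-42")
# )
```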