Closed daniel-maturana closed 21 hours ago
@daniel-maturana Thanks for the FR! It looks like LangGraph compiles the graph into a LangChain Runnable (which mlflow.langchain supports). Can you use mlflow.langchain.log_model to log your graph?
Hi Harutaka, yes, I can log the calls and trace the model; that's working perfectly. But I was actually interested in saving, loading, and deploying models using MLflow.
Maybe this isn't exactly the right place, but I am trying to use the MLflow integration in Databricks to deploy the agent.
Best,
Daniel
@daniel-maturana can you register the logged model and deploy it as an endpoint? Here's a tutorial that demonstrates how to create custom model serving endpoints:
https://docs.databricks.com/en/machine-learning/model-serving/create-manage-serving-endpoints.html
Hi @harupy, well yes, it is possible and it is what I did. I tried to deploy it the same way as a LangChain model, registering it with
mlflow.models.set_model
But that was not possible because it is not a LangChain type.
I finally registered the model as a pyfunc.
Is there any big difference between the two ways of saving and registering the model?
Thanks
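For context, the pyfunc route mentioned above can be sketched like this (the class name and fields are illustrative, and an echo stands in for invoking the real graph). The practical difference: the langchain flavor reloads the model as a native Runnable, while pyfunc exposes only a generic predict interface:

```python
# Hedged sketch: wrapping an agent as a pyfunc model and logging it.
import mlflow
from mlflow.pyfunc import PythonModel

class GraphWrapper(PythonModel):
    def predict(self, context, model_input, params=None):
        # model_input arrives from the serving layer (dict or DataFrame);
        # a real wrapper would invoke the compiled LangGraph graph here.
        return [{"answer": f"echo: {q}"} for q in model_input["question"]]

with mlflow.start_run():
    info = mlflow.pyfunc.log_model(
        artifact_path="agent", python_model=GraphWrapper()
    )

loaded = mlflow.pyfunc.load_model(info.model_uri)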
@daniel-maturana did you get it to work for the tracing as well?
@morenoj11 I used the pyfunc model with code to do the deployment in Databricks. In my .py file with all the logic, I create a class with the predict and predict_stream methods, as well as some accessory classes that convert the input to LangGraph state and the state to output. Then I registered and deployed it in Databricks, no problem. When I run it in the notebook, it works perfectly and traces all the nodes, chains, and functions where I added the tracer decorator. I also modified the output to return the SQL queries as well as code, so everything is working.
However, I am now stuck because my LangGraph agent, which is a SQL and coder agent, is taking too long and the serving endpoint is hitting a worker timeout.
I will try with another RAG LangGraph that I already have, to check whether the timeout is a LangGraph problem or really just a timeout. I tried changing several TIMEOUT environment variables from MLflow, but it is not working yet.
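One knob worth noting (my assumption, not something the thread confirms): MLflow's local scoring server runs on gunicorn, so the generic GUNICORN_CMD_ARGS variable can raise the worker timeout for local serving. Databricks-managed endpoints may not expose this knob, and the 600-second value is illustrative.

```shell
# GUNICORN_CMD_ARGS is a standard gunicorn mechanism, not an MLflow flag.
export GUNICORN_CMD_ARGS="--timeout 600"
echo "$GUNICORN_CMD_ARGS"
# Then start the scoring server as usual, e.g.:
# mlflow models serve -m "models:/my_agent/1" --env-manager local
```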
Best,
@daniel-maturana thanks for the elaborate response! Would you mind keeping me/this discussion updated on the progress? Also if you could share some sample code it would be great. I'm in a similar situation, and I think I'll go with Langfuse/Langsmith until MLflow catches up, but if there's a workaround I'd be down for it.
Thanks!
LangSmith can capture feedback for experiments. Is there a similar capability in MLflow for LangGraph/LangChain?
Hi folks! MLflow now supports saving LangGraph models in the native langchain flavor. Please check it out and let us know if there are any bugs or feedback!
https://mlflow.org/docs/latest/llms/langchain/index.html#how-can-i-log-an-agent-built-with-langgraph-to-mlflow
@mindful-time We don't support this yet, but we are planning to add a feature for recording human feedback. Please feel free to open a new FR if you have a specific use case in mind. Thanks!
Willingness to contribute
Yes. I can contribute this feature independently.
Proposal Summary
LangGraph uses the LangChain Expression Language and graphs to build agents. It is a good solution for multi-agent development, and its use is sure to grow. I have developed several agents using Databricks data, SQL, retrievals, etc., but I need to deploy them on Databricks using MLflow.
Motivation
Details
No response
What component(s) does this bug affect?
area/artifacts: Artifact stores and artifact logging
area/build: Build and test infrastructure for MLflow
area/deployments: MLflow Deployments client APIs, server, and third-party Deployments integrations
area/docs: MLflow documentation pages
area/examples: Example code
area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
area/models: MLmodel format, model serialization/deserialization, flavors
area/recipes: Recipes, Recipe APIs, Recipe configs, Recipe Templates
area/projects: MLproject format, project running backends
area/scoring: MLflow Model server, model deployment tools, Spark UDFs
area/server-infra: MLflow Tracking server backend
area/tracking: Tracking Service, tracking client APIs, autologging
What interface(s) does this bug affect?
area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
area/windows: Windows support
What language(s) does this bug affect?
language/r: R APIs and clients
language/java: Java APIs and clients
language/new: Proposals for new client languages
What integration(s) does this bug affect?
integrations/azure: Azure and Azure ML integrations
integrations/sagemaker: SageMaker integrations
integrations/databricks: Databricks integrations