MLflow 2.13.1 is a patch release that includes several bug fixes and integration improvements to existing features. The new features introduced in this patch are intended to provide a foundation for major features planned for the next release.
Features:
[MLflow] Add mlflow[langchain] extra that installs recommended versions of langchain with MLflow (#12182, @sunishsheth2009)
[Tracking] Add the ability to override the `model_config` in the langchain flavor when loaded as pyfunc (#12085, @sunishsheth2009)
[Model Registry] Automatically detect if Presigned URLs are required for Unity Catalog (#12177, @artjen)
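The `model_config` override above works by layering a load-time configuration over the one saved with the model. As a dependency-free sketch of that merge semantics (the helper name and config keys here are hypothetical, not MLflow's internals):

```python
# Hypothetical sketch of model_config override semantics. In MLflow, a
# dict like this can be passed when loading a langchain-flavor model as
# pyfunc, overriding values that were saved at logging time.

def resolve_model_config(saved_config, override=None):
    """Merge a load-time override over the config saved with the model."""
    merged = dict(saved_config)    # start from the logged defaults
    merged.update(override or {})  # load-time values win
    return merged

saved = {"temperature": 0.1, "max_tokens": 256}
effective = resolve_model_config(saved, {"temperature": 0.9})
# effective == {"temperature": 0.9, "max_tokens": 256}
```

Keys absent from the override keep their logged defaults, so callers only need to specify the values they want to change.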
Bug fixes:
[Tracking] Use `getUserLocalTempDir` and `getUserNFSTempDir` in place of `getReplLocalTempDir` and `getReplNFSTempDir` in the Databricks runtime (#12105, @WeichenXu123)
[Model] Update the chat model to take a default `input_example` and make `predict` accept JSON during inference (#12115, @sunishsheth2009)
[Tracking] Automatically call load_context when inferring signature in pyfunc (#12099, @sunishsheth2009)
v2.13.0
MLflow 2.13.0 includes several major features and improvements.
With this release, we're happy to introduce several features that enhance the usability of MLflow broadly across a range of use cases.
Major Features and Improvements:
Streamable Python Models: The newly introduced predict_stream API for Python Models allows custom model implementations to return a generator object, permitting full streaming customization for GenAI applications.
Enhanced Code Dependency Inference: Code dependencies can now be inferred automatically from a model's implementation. As a supplement to the code_paths parameter, the new infer_model_code_paths option, when logging a model, determines which additional code modules are needed to ensure that your models can be loaded in isolation, deployed, and reliably stored.
Standardization of MLflow Deployment Server: Outputs from the Deployment Server's endpoints now conform to OpenAI's interfaces to provide a simpler integration with commonly used services.
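For illustration, the OpenAI chat-completions response shape that the Deployment Server endpoints now mirror looks roughly like the following. All field values here are made up; the structure follows OpenAI's published schema:

```python
# Illustrative example of the OpenAI chat-completions response shape
# that OpenAI-compatible endpoints return. All values are made up.

response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1716400000,
    "model": "example-model",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello!"},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 9, "completion_tokens": 3, "total_tokens": 12},
}

# Clients written against the OpenAI schema can read replies uniformly,
# regardless of which provider backs the endpoint:
reply = response["choices"][0]["message"]["content"]
```

Conforming to this shape is what lets existing OpenAI client code point at the Deployment Server without translation layers.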
Features:
[Deployments] Update the MLflow Deployment Server interfaces to be OpenAI compatible (#12003, @harupy)
[Deployments] Add Togetherai as a supported provider for the MLflow Deployments Server (#11557, @FotiosBistas)
[Models] Add predict_stream API support for Python Models (#11791, @WeichenXu123)
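The predict_stream contract above can be sketched as follows. This is a hypothetical, dependency-free illustration of the pattern only: in real use the class would subclass `mlflow.pyfunc.PythonModel` and be logged with `mlflow.pyfunc.log_model`, and the token source would be a real GenAI backend.

```python
# Hypothetical sketch of the predict_stream pattern (MLflow 2.13+).
# Kept dependency-free to show only the generator contract; a real
# implementation would subclass mlflow.pyfunc.PythonModel.

class StreamingModel:
    def predict(self, context, model_input, params=None):
        # Non-streaming path: return the full response at once.
        return " ".join(self._tokens(model_input))

    def predict_stream(self, context, model_input, params=None):
        # Streaming path: return a generator that yields chunks as they
        # become available (e.g. tokens from an LLM).
        for token in self._tokens(model_input):
            yield token

    def _tokens(self, model_input):
        # Stand-in for a real token stream from a GenAI backend.
        return [f"echo:{word}" for word in str(model_input).split()]

model = StreamingModel()
chunks = list(model.predict_stream(None, "hello streaming world"))
# chunks == ["echo:hello", "echo:streaming", "echo:world"]
```

Because predict_stream returns a generator, callers can begin consuming output before the full response is produced, which is the key property for streaming GenAI applications.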
Commits
- `de95337` Run `python3 dev/update_mlflow_versions.py pre-release ...` (#12270)
- `e14c7cf` Suppress trace display while loading model-as-code langchain model that inclu...
- `96e51a7` Avoid importing `mlflow.gateway` at the top level of `mlflow.deployment` modu...
- `80c0b6e` Silence traces when logging langchain models (#12210)
- `231740f` Call `_flatten_nested_params` only when `model_config` is truthy (#12214)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions
You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/google/caliban/network/alerts).
Bumps the pip group with 1 update in the /tutorials/uv-metrics directory: mlflow.
Updates `mlflow` from 1.10.0 to 2.13.2
Release notes
Sourced from mlflow's releases.
... (truncated)
Changelog
Sourced from mlflow's changelog.
... (truncated)
Additional commits
- `0e4c4f5` Use unique temp dir for model code (#12223)
- `f097aff` Rename environment variables to not include "DATABRICKS" (#12260)
- `b79c425` Add alternative package names for RAG to requirements exclusion validation (#...
- `68db950` Mock dbutils when loading model code path (#12226)
- `fd8fb05` Ignore databricks_rag_studio in package mismatch (#12231)