In MLflow 2.10, we're introducing a number of significant new features that pave the way for enhanced support for Deep Learning use cases, broaden support for GenAI applications, and add some quality-of-life improvements to the MLflow Deployments Server (formerly the AI Gateway).
New MLflow Website
We have a new home. The new site landing page is fresh, modern, and contains more content than ever. We're adding new content and blogs all of the time.
Objects and Arrays are now available as configurable input and output schema elements. These new types are particularly useful for GenAI-focused flavors that can have complex input and output types. See the new Signature and Input Example documentation to learn more about how to use these new signature types.
LangChain now has autologging support! With autologging enabled, invoking a chain automatically logs most chain implementations, recording and storing your configured LLM application for you. See the new LangChain documentation to learn more about how to use this feature.
The MLflow transformers flavor now supports prompt templates. You can specify an application-specific set of instructions for your GenAI pipeline, streamlining how system prompts are supplied with each input request. Check out the updated guide to transformers to learn more and see examples!
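For illustration, a prompt template is just a string containing a `{prompt}` placeholder that MLflow fills in with each incoming request. The logging call below is a hypothetical sketch (it assumes `mlflow` and `transformers` are installed and that `generator` is a text-generation pipeline):

```python
# The template is applied at inference time: {prompt} is replaced with the
# user's input before the text reaches the pipeline.
prompt_template = "Answer the question concisely.\nQ: {prompt}\nA:"

# Hypothetical logging call (requires `mlflow` and `transformers`):
# import mlflow
# mlflow.transformers.log_model(
#     transformers_model=generator,  # a transformers text-generation pipeline
#     artifact_path="model",
#     prompt_template=prompt_template,
# )
```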
The MLflow Deployments Server now supports two frequently requested features: (1) OpenAI endpoints that support streaming responses. You can now configure an endpoint to return real-time responses for Chat and Completions instead of waiting for the entire text to be generated. (2) Rate limits can now be set per endpoint to help control cost overruns when using SaaS models.
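As a sketch of what a per-endpoint rate limit might look like in a Deployments Server configuration file (the endpoint and model names are illustrative, and the exact schema should be confirmed against the Deployments Server documentation):

```yaml
endpoints:
  - name: chat
    endpoint_type: llm/v1/chat
    model:
      provider: openai
      name: gpt-3.5-turbo
      config:
        openai_api_key: $OPENAI_API_KEY
    # Hedged sketch: cap this endpoint at 10 calls per minute.
    limit:
      renewal_period: minute
      calls: 10
```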
Further Documentation Improvements
Continued the push for enhanced documentation, guides, tutorials, and examples by expanding on core MLflow functionality (Deployments, Signatures, and Model Dependency management), as well as entirely new pages for GenAI flavors. Check them out today!
Other Features:
[Models] Enhance the MLflow Models predict API to serve as a pre-logging validator of environment compatibility (#10759, @B-Step62)
[Models] Add support for Image Classification pipelines within the transformers flavor (#10538, @KonakanchiSwathi)
[Models] Add support for retrieving and storing license files for transformers models (#10871, @BenWilson2)
[Models] Add support for model serialization in the Visual NLP format for JohnSnowLabs flavor (#10603, @C-K-Loan)
[Models] Automatically convert OpenAI input messages to LangChain chat messages for pyfunc predict (#10758, @dbczumar)
[Tracking] Enhance async logging functionality by ensuring flush is called on Futures objects (#10715, @chenmoneygithub)
[Tracking] Add support for a non-interactive mode for the login() API (#10623, @henxing)
[Scoring] Allow MLflow model serving to support direct dict inputs with the messages key (#10742, @daniellok-db, @B-Step62)
[Deployments] Add streaming support to the MLflow Deployments Server for OpenAI streaming return compatible routes (#10765, @gabrielfu)
[Deployments] Add support for directly interfacing with OpenAI via the MLflow Deployments server (#10473, @prithvikannan)
[UI] Introduce a number of new features for the MLflow UI (#10864, @daniellok-db)
[Server-infra] Add an environment variable that can disallow HTTP redirects (#10655, @daniellok-db)
[Artifacts] Add support for Multipart Upload for Azure Blob Storage (#10531, @gabrielfu)
Bug fixes:
[Models] Add deduplication logic for pip requirements and extras handling for MLflow models (#10778, @BenWilson2)
[Tracking] Fix an issue with an incorrect retry default timeout for urllib3 1.x (#10839, @BenWilson2)
[Recipes] Fix an issue with MLflow Recipes card display format (#10893, @WeichenXu123)
[Java] Fix an issue with metadata collection when using Streaming Sources on certain versions of Spark where Delta is the source (#10729, @daniellok-db)
Features:
[Models] Introduce Objects and Arrays support for model signatures (#9936, @serena-ruan)
[Models] Support saving prompt templates for transformers (#10791, @daniellok-db)
Bug fixes:
[Scoring] Fix an issue where SageMaker tags were not propagating correctly (#9310, @clarkh-ncino)
[Windows / Databricks] Fix an issue with executing Databricks run commands from within a Windows environment (#10811, @wolpl)
[Models / Databricks] Disable mlflowdbfs mounts for the JohnSnowLabs flavor due to flakiness (#9872, @C-K-Loan)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/opendp/smartnoise-sdk/network/alerts).
Bumps mlflow from 2.9.2 to 2.10.0.
Release notes
Sourced from mlflow's releases.
... (truncated)
Changelog
Sourced from mlflow's changelog.
... (truncated)
Commits
628fba4 Fix azure openai and docs (#10894)
ebb8b2d Run python3 dev/update_mlflow_versions.py pre-release ... (#10909)
10d79a7 Run python3 dev/update_ml_package_versions.py (#10907)
67090c8 Run python3 dev/update_pypi_package_index.py (#10905)
d1ee9c9 Run python3 dev/update_requirements.py --requirements-... (#10906)
211fbc7 Fix langchain test (#10901)
97e85f9 Revert "Implement promptflow model flavor (#10104)" (#10903)
c438237 Fix recipe card display format (#10893)
b7c0b77 Fixed the KeyError: 'loss' bug for the Quickstart guidline (#10886)
6e5ec77 Add unit tests for Docker image building and refactor (#10876)