google / caliban

Research workflows made easy, locally and in the Cloud.
https://caliban.readthedocs.io
Apache License 2.0

Bump mlflow from 1.10.0 to 2.13.2 in /tutorials/uv-metrics in the pip group across 1 directory #131

Open · dependabot[bot] opened 5 months ago

dependabot[bot] commented 5 months ago

Bumps the pip group with 1 update in the /tutorials/uv-metrics directory: mlflow.

Updates mlflow from 1.10.0 to 2.13.2
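
This is a major-version jump (1.x to 2.x), so tutorial code that imports mlflow may need adjustments beyond the pin itself. As a quick sanity check, here is a minimal sketch for confirming that the environment resolved the new pin; it assumes mlflow was installed from the updated /tutorials/uv-metrics requirements and that the `packaging` library is available (it usually ships alongside pip):

```python
# Minimal sketch: confirm the environment picked up the bumped mlflow pin.
# Assumes mlflow was installed from the updated /tutorials/uv-metrics
# requirements; `packaging` is assumed to be importable.
from packaging.version import Version

import mlflow

installed = Version(mlflow.__version__)
assert installed >= Version("2.13.2"), f"expected mlflow >= 2.13.2, got {installed}"
print(f"mlflow {installed} is installed")
```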

Release notes

Sourced from mlflow's releases.

v2.13.2

MLflow 2.13.2 is a patch release that includes several bug fixes and integration improvements to existing features.

Features:

Bug fixes:

Small bug fixes and documentation updates:

#12268, #12210, @​B-Step62; #12214, @​harupy; #12223, #12226, @​annzhang-db; #12260, #12237, @​prithvikannan; #12261, @​BenWilson2; #12231, @​serena-ruan; #12238, @​sunishsheth2009

v2.13.1

MLflow 2.13.1 is a patch release that includes several bug fixes and integration improvements to existing features. New features introduced in this patch release are intended to provide a foundation for major features arriving in the next release.

Features:

  • [MLflow] Add mlflow[langchain] extra that installs recommended versions of langchain with MLflow (#12182, @sunishsheth2009) (see the install sketch after this list)
  • [Tracking] Adding the ability to override the model_config in langchain flavor if loaded as pyfunc (#12085, @​sunishsheth2009)
  • [Model Registry] Automatically detect if Presigned URLs are required for Unity Catalog (#12177, @​artjen)
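
A minimal sketch of using the new extra called out in the first feature above. The `pip install` spec in the comment and the version pin are illustrative only; the actual set of langchain packages is chosen by mlflow's extras metadata, not by this PR:

```python
# Minimal sketch: after installing the extra, e.g.
#   pip install "mlflow[langchain]==2.13.1"
# the langchain flavor module should import without a separate, hand-pinned
# langchain install. The version spec above is illustrative.
import mlflow
import mlflow.langchain  # importable once the extra's dependencies are present

print(mlflow.__version__)
```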

Bug fixes:

  • [Tracking] Use getUserLocalTempDir and getUserNFSTempDir to replace getReplLocalTempDir and getReplNFSTempDir in databricks runtime (#12105, @​WeichenXu123)
  • [Model] Updating chat model to take default input_example and predict to accept json during inference (#12115, @​sunishsheth2009)
  • [Tracking] Automatically call load_context when inferring signature in pyfunc (#12099, @​sunishsheth2009)

v2.13.0

MLflow 2.13.0 includes several major features and improvements

With this release, we're happy to introduce several features that enhance the usability of MLflow broadly across a range of use cases.

Major Features and Improvements:

  • Streamable Python Models: The newly introduced predict_stream API for Python Models allows custom model implementations to return a generator object, permitting full customization for GenAI applications (see the sketch after this list).

  • Enhanced Code Dependency Inference: A new feature for automatically inferring code dependencies based on detected dependencies within a model's implementation. As a supplement to the code_paths parameter, the introduced infer_model_code_paths option when logging a model will determine which additional code modules are needed to ensure that your models can be loaded in isolation, deployed, and reliably stored.

  • Standardization of MLflow Deployment Server: Outputs from the Deployment Server's endpoints now conform to OpenAI's interfaces to provide a simpler integration with commonly used services.
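
As a concrete illustration of the streamable model feature noted above, here is a minimal sketch of a custom PythonModel that implements both predict and predict_stream. The hook signatures follow the standard pyfunc pattern, but treat this as an assumption to verify against the 2.13.x docs rather than a definitive implementation:

```python
# Minimal sketch of a streamable Python model (assumes mlflow 2.13.x).
import mlflow
import mlflow.pyfunc


class StreamingEchoModel(mlflow.pyfunc.PythonModel):
    @staticmethod
    def _texts(model_input):
        # model_input may arrive as a list or a pandas DataFrame depending on
        # how the pyfunc layer converts the caller's data; handle both.
        if hasattr(model_input, "iloc"):
            return list(model_input.iloc[:, 0])
        return list(model_input)

    def predict(self, context, model_input, params=None):
        # Non-streaming path: return everything at once.
        return [f"echo: {text}" for text in self._texts(model_input)]

    def predict_stream(self, context, model_input, params=None):
        # Streaming path: yield one chunk at a time (a generator), which is
        # what GenAI-style callers consume incrementally.
        for text in self._texts(model_input):
            yield f"echo: {text}"


with mlflow.start_run():
    info = mlflow.pyfunc.log_model(artifact_path="model", python_model=StreamingEchoModel())

loaded = mlflow.pyfunc.load_model(info.model_uri)
for chunk in loaded.predict_stream(["hello", "world"]):
    print(chunk)
```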

Features:

  • [Deployments] Update the MLflow Deployment Server interfaces to be OpenAI compatible (#12003, @harupy) (client sketch after this list)
  • [Deployments] Add Togetherai as a supported provider for the MLflow Deployments Server (#11557, @​FotiosBistas)
  • [Models] Add predict_stream API support for Python Models (#11791, @​WeichenXu123)
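
For the OpenAI-compatible Deployment Server interfaces mentioned above, a hedged client sketch follows. The server URL, the endpoint name "chat", and the existence of a running deployments server are assumptions about a local setup, not something this PR configures:

```python
# Minimal sketch: send an OpenAI-style chat payload to an MLflow Deployments
# Server endpoint. The URL and endpoint name below are assumptions; substitute
# whatever your deployments configuration defines.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("http://localhost:5000")
response = client.predict(
    endpoint="chat",
    inputs={"messages": [{"role": "user", "content": "Say hello."}]},
)
print(response)
```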

... (truncated)

Changelog

Sourced from mlflow's changelog.

2.13.2 (2024-06-06)

MLflow 2.13.2 is a patch release that includes several bug fixes and integration improvements to existing features.

Features:

Bug fixes:

Small bug fixes and documentation updates:

#12268, #12210, @​B-Step62; #12214, @​harupy; #12223, #12226, @​annzhang-db; #12260, #12237, @​prithvikannan; #12261, @​BenWilson2; #12231, @​serena-ruan; #12238, @​sunishsheth2009

2.13.1 (2024-05-30)

MLflow 2.13.1 is a patch release that includes several bug fixes and integration improvements to existing features. New features introduced in this patch release are intended to provide a foundation for major features arriving in the next release.

Features:

  • [MLflow] Add mlflow[langchain] extra that installs recommended versions of langchain with MLflow (#12182, @​sunishsheth2009)
  • [Tracking] Adding the ability to override the model_config in langchain flavor if loaded as pyfunc (#12085, @​sunishsheth2009)
  • [Model Registry] Automatically detect if Presigned URLs are required for Unity Catalog (#12177, @​artjen)

Bug fixes:

  • [Tracking] Use getUserLocalTempDir and getUserNFSTempDir to replace getReplLocalTempDir and getReplNFSTempDir in databricks runtime (#12105, @​WeichenXu123)
  • [Model] Updating chat model to take default input_example and predict to accept json during inference (#12115, @​sunishsheth2009)
  • [Tracking] Automatically call load_context when inferring signature in pyfunc (#12099, @​sunishsheth2009)

Small bug fixes and documentation updates:

#12180, #12152, #12128, #12126, #12100, #12086, #12084, #12079, #12071, #12067, #12062, @​serena-ruan; #12175, #12167, #12137, #12134, #12127, #12123, #12111, #12109, #12078, #12080, #12064, @​B-Step62; #12142, @​2maz; #12171, #12168, #12159, #12153, #12144, #12104, #12095, #12083, @​harupy; #12160, @​aravind-segu; #11990, @​kriscon-db; #12178, #12176, #12090, #12036, @​sunishsheth2009; #12162, #12110, #12088, #11937, #12075, @​daniellok-db; #12133, #12131, @​prithvikannan; #12132, #12035, @​annzhang-db; #12121, #12120, @​liangz1; #12122, #12094, @​dbczumar; #12098, #12055, @​mparkhe

2.13.0 (2024-05-20)

MLflow 2.13.0 includes several major features and improvements

With this release, we're happy to introduce several features that enhance the usability of MLflow broadly across a range of use cases.

Major Features and Improvements:

  • Streamable Python Models: The newly introduced predict_stream API for Python Models allows custom model implementations to return a generator object, permitting full customization for GenAI applications.

  • Enhanced Code Dependency Inference: A new feature for automatically inferring code dependencies based on detected dependencies within a model's implementation. As a supplement to the code_paths parameter, the introduced infer_model_code_paths option when logging a model will determine which additional code modules are needed to ensure that your models can be loaded in isolation, deployed, and reliably stored.

... (truncated)

Commits
  • de95337 Run python3 dev/update_mlflow_versions.py pre-release ... (#12270)
  • e14c7cf Suppress trace display while loading model-as-code langchain model that inclu...
  • 96e51a7 Avoid importing mlflow.gateway at the top level of mlflow.deployment modu...
  • 80c0b6e Silence traces when logging langchain models (#12210)
  • 231740f Call _flatten_nested_params only when model_config is truthy (#12214)
  • 0e4c4f5 Use unique temp dir for model code (#12223)
  • f097aff Rename environment variables to not include "DATABRICKS" (#12260)
  • b79c425 Add alternative package names for RAG to requirements exclusion validation (#...
  • 68db950 Mock dbutils when loading model code path (#12226)
  • fd8fb05 Ignore databricks_rag_studio in package mismatch (#12231)
  • Additional commits viewable in compare view


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
  • `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
  • `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
  • `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
  • `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions

You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/google/caliban/network/alerts).