mlflow / mlflow-torchserve

Plugin for deploying MLflow models to TorchServe
Apache License 2.0

[FR] Compatibility with MLflow 2.0 #96

Open dbczumar opened 1 year ago

dbczumar commented 1 year ago

Thank you for submitting a feature request. Before proceeding, please review MLflow's Issue Policy for feature requests and the MLflow Contributing Guide.

Please fill in this feature request template to ensure a timely and thorough response.

Willingness to contribute

The MLflow Community encourages new feature contributions. Would you or another member of your organization be willing to contribute an implementation of this feature (as an enhancement to the MLflow TorchServe Deployment plugin code base)?

Proposal Summary

In MLflow 2.0 (scheduled for release on Nov. 14), we will be making small modifications to the MLflow Model Server's RESTful scoring protocol (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/models.html#deploy-mlflow-models) and the MLflow Deployment Client predict() API (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/python_api/mlflow.deployments.html#mlflow.deployments.BaseDeploymentClient.predict).

For compatibility with MLflow 2.0, the mlflow-torchserve plugin will need to be updated to conform to the new scoring protocol and Deployment Client interface. The MLflow maintainers are happy to assist with this process, and we apologize for the short notice.

Motivation

What component(s) does this feature affect?

Components

dbczumar commented 1 year ago

@shrinath-suresh @chauhang Do you have bandwidth on your end to migrate the mlflow-torchserve plugin to the updated scoring protocol and adjust the Deployment Client predict() API? By my estimates, it should only take a few hours of work at most. Apologies for the short notice.

shrinath-suresh commented 1 year ago

@dbczumar I am testing the changes with MLflow 2.0.

From MLflow 2.0, is it mandatory for the predict response to be of type pd.DataFrame or pd.Series? We are getting an error while converting text/JSON due to the result.to_json() conversion done here

Up to MLflow 1.30.0, we returned the result as a string (code link) and the MLflow CLI converted it to JSON (code link).

In most of the ML examples (BERT, MNIST, iris classification), the model output is either JSON or text. If the changes are intentional, we need to change the mlflow_torchserve plugin's predict method to convert the user response to a DataFrame and return it to the MLflow CLI
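For illustration, a conversion along these lines could wrap the raw TorchServe payload before handing it back to the MLflow CLI. This is a hypothetical helper, not code from the plugin; the function name `wrap_torchserve_response` and the `"predictions"` column name are assumptions:

```python
import json
import pandas as pd


def wrap_torchserve_response(raw):
    """Wrap a raw TorchServe response (text or a JSON string) in a
    pandas DataFrame so that a downstream ``result.to_json()`` call,
    as in the MLflow 2.0 scoring path, has something to serialize.

    Hypothetical sketch -- not part of the mlflow-torchserve plugin.
    """
    try:
        parsed = json.loads(raw)
    except (TypeError, ValueError):
        # Plain-text response (e.g. a BERT sentiment label).
        parsed = raw
    if isinstance(parsed, dict):
        # A single JSON object becomes a one-row DataFrame.
        return pd.DataFrame([parsed])
    if isinstance(parsed, list):
        return pd.DataFrame({"predictions": parsed})
    return pd.DataFrame({"predictions": [parsed]})
```

A JSON list such as `'["cat", "dog"]'` would become a two-row DataFrame, while a bare string becomes a single-row one.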

shrinath-suresh commented 1 year ago

@dbczumar @harupy

dbczumar commented 1 year ago

@shrinath-suresh So sorry for the delay here. The deployment client predict() method always needs to return a PredictionsResponse object (https://mlflow.org/docs/latest/python_api/mlflow.deployments.html#mlflow.deployments.PredictionsResponse). For your use case, feel free to subclass PredictionsResponse and define to_json() as desired :)
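A minimal sketch of that suggestion might look like the following. The subclass name `RawPredictionsResponse` and the `"predictions"` key are assumptions for illustration; the import fallback is only there so the snippet runs without MLflow installed:

```python
import json

try:
    # MLflow >= 2.0 exposes this in mlflow.deployments.
    from mlflow.deployments import PredictionsResponse
except ImportError:
    # Stand-in so the sketch is self-contained; the real class is a
    # dict subclass with get_predictions()/to_json() helpers.
    class PredictionsResponse(dict):
        pass


class RawPredictionsResponse(PredictionsResponse):
    """Hypothetical subclass that serializes the raw TorchServe
    payload (text or JSON) instead of assuming a DataFrame result."""

    def to_json(self):
        # Assumes the raw model output was stored under "predictions".
        return json.dumps({"predictions": self.get("predictions")})


resp = RawPredictionsResponse({"predictions": ["positive"]})
```

The plugin's predict() could then build and return such an object regardless of whether the model emitted JSON or plain text.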

shrinath-suresh commented 1 year ago

@dbczumar Can you please review this PR when you find time ? https://github.com/mlflow/mlflow-torchserve/pull/97

dbczumar commented 1 year ago

@shrinath-suresh Approved! Happy New Year!

Habardeen commented 1 year ago

Has this been merged/solved, and are we good to go with the latest version? It looks like it from #97, but I just wanted to confirm whether there's anything outstanding, since the issue is still open.

bhack commented 10 months ago

Is this plugin still maintained?