bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0
7.13k stars · 791 forks

feat: mlflow integration #2516

Closed TheisFerre closed 2 years ago

TheisFerre commented 2 years ago

I just saw that you have created a pre-release 1.0.0-rc0, with quite a few changes. As I am quite interested in using bentoml as a framework for serving models, I wanted to test out this new pre-release.

My development environment is Databricks, where I train models and log/track them with MLflow. From the updated documentation, I am not quite sure how I can load a model that has been saved in MLflow as a BentoML model and use it for serving.

Before the latest pre-release, I would be able to load a model in MLFlow as follows:

```python
uri = <databricks_model_uri>
model = bentoml.mlflow.import_from_uri("model", uri)
```

I noticed that when I use the `bentoml.mlflow` module, it tells me to create a custom runner. Do you know when you will support MLflow models with the new runners implementation, or do you have any pointers for implementing this myself?

aarnphm commented 2 years ago

Hi @TheisFerre, thanks for checking out the new releases. We are currently working on an integration guide between MLflow and BentoML, since the BentoML runner concept has changed a lot since 1.0.0a7. We will try to get back to you ASAP.

TheisFerre commented 2 years ago

@aarnphm Sounds great, thanks a lot! I am excited for the 1.0.0 release.

ssheng commented 2 years ago

@TheisFerre We plan to include support for MLflow in the official 1.0 release. In the next couple of weeks, we will add support for the MLflow framework. Please keep an eye on the next RC releases. Meanwhile, if it is urgent, feel free to create a custom runner for your use case. cc: @parano

TheisFerre commented 2 years ago

Great, I will close this issue, as it is something you are planning to support in the near future.

marcindulak commented 2 years ago

If possible, let's keep the issue open until it's implemented.

aarnphm commented 2 years ago

This is currently a high-priority item on our end; we will make sure MLflow is supported in the official 1.0 release.

marcindulak commented 2 years ago

I find it easier to track a project's development when issues are closed by pull requests and the corresponding "Subscribe" notifications arrive.

aarnphm commented 2 years ago

The current progress is tracked in https://github.com/bentoml/BentoML/pull/2702.

aarnphm commented 2 years ago

Hi @marcindulak, MLflow is now supported in BentoML 1.0. Feel free to explore the integration at https://github.com/bentoml/gallery/tree/main/mlflow
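
For anyone landing here later, a minimal sketch of the 1.0-style integration looks roughly like the following. This assumes a hypothetical model name `"my_mlflow_model"` and a placeholder MLflow model URI; the exact I/O descriptors depend on your model, so treat this as an illustration rather than the definitive recipe (the linked gallery examples are authoritative):

```python
import bentoml
import numpy as np
from bentoml.io import NumpyNdarray

# One-time step: import the MLflow model into the local BentoML model store.
# "runs:/<run_id>/model" is a placeholder URI; on Databricks this could also
# be a "models:/..." registry URI.
bentoml.mlflow.import_model("my_mlflow_model", model_uri="runs:/<run_id>/model")

# In your service definition: fetch the stored model and wrap it in a runner.
runner = bentoml.mlflow.get("my_mlflow_model:latest").to_runner()

svc = bentoml.Service("mlflow_service", runners=[runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
def predict(input_arr: np.ndarray) -> np.ndarray:
    # Delegate inference to the runner, which loads the model
    # via mlflow.pyfunc under the hood.
    return runner.predict.run(input_arr)
```

Compared with the pre-1.0 `import_from_uri` flow quoted above, the 1.0 API splits the work into an explicit import step (into the model store) and a `to_runner()` step inside the service, so the same stored model can be reused across services and bentos.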