iterative / mlem

🐶 A tool to package, serve, and deploy any ML model on any platform. Archived to be resurrected one day🤞
https://mlem.ai
Apache License 2.0

Investigate options to apply MLEM models within Airflow DAGS #11

Open aguschin opened 3 years ago

aguschin commented 3 years ago

As discussed with @mnrozhkov and @tapadipti, it would be great if MLEM would somehow simplify model application within Airflow DAGs. Two questions to start with:

  1. We could create something like a MLEMOperator. What would its functionality be? How would it help users? (See the sketch after this list.)
  2. We need to build either a virtual environment or a Docker image to run the model in the required environment. Two options for providing those would be to build them in CI or as a task in the same DAG. We need to explore these options and figure out how MLEM can simplify this work for users. Note: if you run multiple workers, it may be beneficial to build the environment in advance. If you have a single worker, you may be fine with building it while running the MLEMOperator.
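A very rough sketch of what a MLEMOperator could look like, assuming it simply wraps `mlem apply` in a custom Airflow operator (the class name, its parameters, and the `-o` output flag are assumptions for discussion, not an existing API):

```python
# Hypothetical MLEMOperator sketch: wraps a `mlem apply` call in an Airflow task.
# The class name, parameters, and the `-o` output flag are assumptions.
import subprocess

from airflow.models.baseoperator import BaseOperator


class MLEMOperator(BaseOperator):
    """Apply a MLEM model to a batch of data inside an Airflow task."""

    def __init__(self, model_path: str, data_path: str, output_path: str, **kwargs):
        super().__init__(**kwargs)
        self.model_path = model_path
        self.data_path = data_path
        self.output_path = output_path

    def execute(self, context):
        # Shell out to the MLEM CLI; the Python API (mlem.api) could be used instead.
        subprocess.run(
            ["mlem", "apply", self.model_path, self.data_path, "-o", self.output_path],
            check=True,
        )
```

Whether this adds enough value over a plain BashOperator/PythonOperator calling `mlem apply` is exactly the open question.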

Other notes:

  1. Sometimes the data is huge and you need to process it in chunks (this may or may not be the case with PySpark; without PySpark it can be too hard to fit all the data in RAM). We need some way to handle this, e.g. iterate over batches and then compile an answer containing the predictions from all batches (see the sketch after this list).
  2. Usually, your DAG = processing + scoring. Roughly, in 25% of cases you load data from disk; in another 50% you work with big data (PySpark, Hadoop); and in the last 25% you work with distributed computing (Spark).
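A minimal sketch of the batch-wise idea for the plain-pandas case, i.e. when the data does not fit in RAM but PySpark is not involved (the model path, chunk size, and column names are illustrative assumptions):

```python
# Score data in chunks so the whole dataset never has to fit in RAM.
# Paths, chunk size, and the output format are illustrative assumptions.
import pandas as pd

from mlem.api import load  # loads the underlying model object (e.g. sklearn)

model = load("models/clf")  # illustrative model path
parts = []
for batch in pd.read_csv("data.csv", chunksize=100_000):
    preds = model.predict(batch)  # score one chunk at a time
    parts.append(pd.DataFrame({"prediction": preds}, index=batch.index))

# Compile the answer containing predictions from all batches
pd.concat(parts).to_csv("predictions.csv")
```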

DAGs example: https://gitlab.com/iterative.ai/cse/use_cases/home_credit_default/-/blob/airflow-dags/dags/scoring.py
Showcase of different options: https://gitlab.com/iterative.ai/cse/rnd/deploy-to-airflow

Summary from Mikhail https://iterativeai.slack.com/archives/C0249LW0HAQ/p1631885782026400

aguschin commented 2 years ago

For reference, there is an Airflow extension that adds support for DVC operations: https://github.com/covid-genomics/airflow-dvc

aguschin commented 1 year ago

UPD: we can also take a look at other orchestration tools (Dagster, Prefect, etc.).
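For illustration, a minimal sketch of what the same batch-scoring task could look like in Prefect 2.x (the flow/task names, paths, and pandas-based scoring are assumptions, not an existing integration):

```python
# Hypothetical Prefect 2.x flow that uses MLEM for batch scoring.
# All names and paths here are illustrative assumptions.
import pandas as pd
from prefect import flow, task

from mlem.api import load


@task
def score(model_path: str, data_path: str, output_path: str) -> None:
    model = load(model_path)  # returns the underlying model object
    data = pd.read_csv(data_path)
    pd.DataFrame({"prediction": model.predict(data)}).to_csv(output_path, index=False)


@flow
def batch_scoring() -> None:
    score("models/clf", "data.csv", "predictions.csv")


if __name__ == "__main__":
    batch_scoring()
```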

aguschin commented 1 year ago

Posting here a short discussion with @mike0sv:

To my mind, we need to get a working demo that is as simple as possible first. If `mlem apply model hdfs://…` works, great, that can be used. If not, downloading the data as a CSV and using `mlem apply model data.csv` can work as well. @mnrozhkov mentioned they have a project with batch scoring in GitLab; we can start by using MLEM there.
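The simplest possible demo might not even need a custom operator: a stock BashOperator calling the CLI could be enough. A hedged sketch, with the DAG settings and paths as placeholders and the `-o` output flag assumed:

```python
# Minimal Airflow 2.x DAG that shells out to the MLEM CLI for batch scoring.
# The DAG id, schedule, model/data paths, and `-o` flag are placeholder assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="mlem_batch_scoring_demo",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,  # trigger manually for the demo
    catchup=False,
) as dag:
    score = BashOperator(
        task_id="score",
        bash_command="mlem apply models/clf data.csv -o predictions",
    )
```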

aguschin commented 1 year ago

One issue I found: when you want to build a Docker image for the batch scoring scenario, you absolutely need to specify the --server option. Why? Let's either make fastapi the default so people won't be confused by this, or allow building without any server, to skip installing extra dependencies (like FastAPI) that people may not need.

UPD: it was an issue with my local MLEM installation. We don't need to specify --server; by default it's fastapi. It might still be good to offer an option to skip installing it, but this is not a priority right now.

aguschin commented 1 year ago

For reference, a product similar to MLEM that exports to Airflow pipelines. We need to take a deeper look at it: https://docs.lineapy.org/en/latest/guide/build_pipelines/pipeline_basics.html

mike0sv commented 1 year ago

For now we are building Docker images only for serving, so the server option is required and the run command is `mlem serve`. That can be changed, but first we need to understand what it means to build a Docker image for batch scoring.