nebari-dev / nebari

🪴 Nebari - your open source data science platform
https://nebari.dev
BSD 3-Clause "New" or "Revised" License

Compare Various Workflow Automation Solutions for potential integration into Nebari #1098

Closed - Adam-D-Lewis closed this 1 year ago

Adam-D-Lewis commented 2 years ago

Feature description

Currently, long-running computations require a browser window to be kept open for the duration of the computation. Some prototype work has been done to enable ‘background’ and ‘batch’ processes to be run on QHub.

Value and/or benefit

This feature would build on that prototype work and make background/batch execution easily accessible to scientists and engineers using the platform.

Anything else?

No response

Adam-D-Lewis commented 2 years ago

There are a few possible options that I've considered: jupyterhub-ssh, kbatch, or just letting whatever solution we come up with in https://github.com/Quansight/qhub/issues/1100 and/or https://github.com/Quansight/qhub/issues/1099 handle this as well.

Both kbatch and jupyterhub-ssh could potentially solve this. kbatch has the advantage that it can run on custom docker images, giving users access to dependencies that aren't available through conda, but the user does not have access to all of the jupyter user's files. Instead, kbatch only allows you to pass in a single file or directory of at most 1 MiB (it uses a ConfigMap under the hood).

jupyterhub-ssh is simpler, works with no additional dependencies, and allows access to all of the jupyter user's files by default, so it seems preferable to me. Both kbatch and jupyterhub-ssh can give users access to the conda envs in conda-store, and both could run notebooks via papermill. Neither option currently allows the user to choose what instance size they run on, and it's not clear to me whether users could still use dask-gateway with kbatch (maybe a permissions issue?) since kbatch runs the job as a separate pod. It's also not clear to me whether the ssh sessions would be closed after the job finished when using jupyterhub-ssh, so that may be something to look into still.

How jupyterhub-ssh would work currently
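Roughly, a sketch of the flow (assuming jupyterhub-ssh is deployed on the cluster; the hostname and env name are illustrative):

ssh <username>@qhub.example.com     # password: your JupyterHub API token
nohup conda run -n myenv papermill input.ipynb output.ipynb > run.log 2>&1 &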

The above isn't too complex, but it might be nice to wrap it in a thin CLI tool similar to kbatch's CLI tool. I'm not particular about the name, but let's say it's "qrunner" for the sake of this example. The user could pip install qrunner, then do something like:
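pip install qrunner
qrunner submit --conda-env myenv -- papermill input.ipynb output.ipynb    # hypothetical CLI; the tool and its flags don't exist yet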

How kbatch would work currently
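A sketch of the equivalent kbatch submission (image name illustrative; if I recall kbatch's flags correctly, --code uploads a single file or small directory, subject to the 1 MiB ConfigMap limit noted above):

kbatch job submit --name=run-notebook \
    --image=myregistry/myimage:latest \
    --command='["papermill", "input.ipynb", "output.ipynb"]' \
    --code=input.ipynb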

Ideal Solution Attributes

Regardless of what solution we use, I believe the ideal solution would have the following attributes:

Adam-D-Lewis commented 2 years ago

Other options to consider:

Adam-D-Lewis commented 2 years ago

Yason

Adam-D-Lewis commented 2 years ago

Kedro

Kedro Features

It seems like Kedro could technically act as a workflow manager, but it's very focused on the data science use case, and using it as a general-purpose workflow engine would likely require us to shoehorn our needs into its existing structure, leading to a bad user experience. I'd see Kedro as useful during data science projects, but not as a general workflow manager.
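To illustrate the structure Kedro imposes, a minimal pipeline sketch (function and dataset names are illustrative):

from kedro.pipeline import node, pipeline

def preprocess(raw_df):
    return raw_df.dropna()

def train(clean_df):
    return {"n_rows": len(clean_df)}  # stand-in for a real model

# inputs/outputs name datasets registered in Kedro's Data Catalog
data_science_pipeline = pipeline([
    node(preprocess, inputs="raw_data", outputs="clean_data"),
    node(train, inputs="clean_data", outputs="model"),
])

Everything revolves around these dataset-to-dataset nodes, which is the data-science-shaped structure referred to above.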

Adam-D-Lewis commented 2 years ago

Jupyterflow

- CLI tool that will launch an environment similar to the jupyter user pod via an Argo workflow
- Can specify simple dependencies (a bit clunky, but works)
- Seems stagnant for a year (last commit Mar 1, 2021)
- Can schedule workflows via cron syntax in the workflow file
- Has an option to override cpu, memory, nodeSelector, etc.
- Uses the same image as the jupyter user pod by default, so we'd need to override either:

Jupyterflow Example Usage

# workflow.yaml
jobs:
- conda run -n myenv papermill input.ipynb output.ipynb  # 1
- conda run -n myenv python train.py softmax 0.5        # 2
- conda run -n myenv python train.py softmax 0.9        # 3
- conda run -n myenv python train.py relu 0.5           # 4
- conda run -n myenv python train.py relu 0.9           # 5
- conda run -n myenv python output.py                   # 6

# Job index starts at 1.
dags:
- 1 >> 2
- 1 >> 3
- 1 >> 4
- 1 >> 5
- 2 >> 6
- 3 >> 6
- 4 >> 6
- 5 >> 6

then jupyterflow run -f workflow.yaml

I like jupyterflow for its simplicity. It makes some reasonable assumptions (which image to use, which volumes to mount) that make it easy for users not familiar with Kubernetes to define and run workflows. We could likely add functionality to either launch in the conda-store image by default or use conda run -n without the user needing to specify it. We could also add the ability to pass env vars through to the workflow by default. It also supports scheduling of workflows (cron). However, more complex workflows may require a different tool. I'm also not familiar with the reporting capabilities of Argo Workflows, which is the only reporting/observability solution (by default) for this.

Perhaps creating some way to make similar assumptions, but using a more fully featured tool, could also be an option if preferred over jupyterflow.

Adam-D-Lewis commented 2 years ago

Hera / Couler
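Both are Python SDKs for authoring Argo Workflows. For a feel of the API, a minimal sketch using Hera (server address/token configuration omitted; exact API details vary between Hera versions):

from hera.workflows import Steps, Workflow, script

@script()
def echo(message: str):
    print(message)

with Workflow(generate_name="echo-", entrypoint="steps") as w:
    with Steps(name="steps"):
        echo(arguments={"message": "hello from QHub"})

w.create()  # submits the workflow to the configured Argo server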

iameskild commented 2 years ago

Argo Workflow

I played around with Argo Workflow today and got a few sample workflows to run using the argo CLI. This was fairly trivial once you have a Kubernetes cluster up and running (I was doing so on QHub deployed on Minikube).

Working with Argo Workflow requires an argo-server running on the cluster (installed via a kubectl apply command); to interact with it, you'll need the aforementioned argo CLI. Argo does seem to have an argo-helm repo which might be useful if/when we want to integrate it into QHub.
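For reference, the sort of commands involved (install manifest elided; the workflow file is illustrative):

kubectl create namespace argo
kubectl apply -n argo -f <argo-workflows install manifest>
argo submit -n argo --watch hello-world.yaml
argo list -n argo
argo logs -n argo @latest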

From skimming the docs for many of the tools listed above, it seems like many of them either require or play nicely with Argo Workflow.

The gap that exists with Argo Workflow is how to enable users to launch these workflows from JupyterLab. Yason or Jupyterflow might be possible solutions. My main concern around these two tools is that they both seem to be maintained by individuals.

In the same vein as Hera, Argo Workflow seems to have an existing Python SDK.

Adam-D-Lewis commented 2 years ago

I'm curious to learn more about the visualizations/reporting in Argo Workflows. I'm also not clear on how authentication/authorization would work. Maybe we don't need to worry about authentication/authorization just yet though.

dharhas commented 2 years ago

So it sounds like Argo is a strong contender for the base layer of our integrated workflow solution, and then on top of it we could potentially have multiple client-side tools leveraging it.

@trallard

trallard commented 2 years ago

Argo is a really versatile orchestration engine - not only does it integrate well with other pipeline/ML tools, but it also opens up loads of possibilities for CI-driven ML workflows. I think it is a good bet in terms of flexibility and extensibility for QHub and its users.

trallard commented 2 years ago

@dharhas @Adam-D-Lewis are we planning to explore more options?

dharhas commented 2 years ago

Well, I don't think we need to explore more options per se, but the current integrations are not fully complete, i.e.:

  1. kbatch - this is integrated and works, but it requires specifying a docker image and does not have the user volumes mounted, so it isn't very straightforward to use
  2. argo workflows - this is integrated on the backend, but we do not yet understand the best way to use it from user space (i.e. Python or Jupyter)

The above should probably be opened as new issues, and then this one can be closed.

iameskild commented 1 year ago

Argo-Workflows has been integrated. This can be closed 🎉