dask / dask-labextension

JupyterLab extension for Dask
BSD 3-Clause "New" or "Revised" License

Cluster auto-discovery and management across environments #82

Open benbovy opened 5 years ago

benbovy commented 5 years ago

As a follow-up on #18 and #31, it would be nice if the "search button" and the cluster management section on the side panel could work across multiple (conda) environments.

Those features work very well when I'm using a single environment where everything (jupyterlab, dask, extensions, etc.) is installed.

However, my (possibly common?) configuration consists of running jupyterlab from its own dedicated, lightweight conda environment and using nb_conda_kernels to run kernels installed in other environments (one per project). In this case the search button is unresponsive, and the cluster management section in the side panel only manages clusters in the jupyterlab environment. I can still manually copy dashboard addresses into the text field, though (and I'm happy doing this!)
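For reference, a minimal sketch of that manual workaround, assuming the cluster is started from a notebook whose kernel lives in the project's conda environment (worker counts are illustrative):

```python
# Run inside a notebook whose kernel lives in the project's conda environment.
from dask.distributed import Client, LocalCluster

# The scheduler and workers inherit this kernel's environment, not JupyterLab's.
cluster = LocalCluster(n_workers=4, threads_per_worker=2)
client = Client(cluster)

# Paste this URL into the extension's text field by hand, since auto-discovery
# does not see clusters started outside the JupyterLab server environment.
print(cluster.dashboard_link)
```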

Unfortunately, I have no idea how much effort this would take to implement.

benbovy commented 5 years ago

> I could still manually copy dashboard addresses in the text field, though (and I'm happy doing this!)

This works locally, but unfortunately not when jupyterlab is running on a remote server (related to #41, I guess).
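A common workaround for the remote case (not specific to this extension, and assuming jupyter-server-proxy is installed on the server) is to rewrite the dashboard link so it goes through the Jupyter server's proxy rather than pointing at a raw port on the remote machine; a sketch:

```python
# Sketch: make dashboard_link produce a URL reachable through the Jupyter server,
# assuming jupyter-server-proxy is available on that server.
import dask
from dask.distributed import Client, LocalCluster

dask.config.set({"distributed.dashboard.link": "/proxy/{port}/status"})

cluster = LocalCluster()
client = Client(cluster)
print(cluster.dashboard_link)  # e.g. /proxy/8787/status, relative to the JupyterLab URL
```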

hadim commented 4 years ago

I have the same workflow as @benbovy. My base conda env only contains a minimal set of libraries to run JLab, and I have one conda env per project.

So it would be nice to be able to create a new cluster "inside" an existing kernel.

Or I would also be fine starting the cluster from the notebook itself, if the extension can then "discover" it.

mangecoeur commented 4 years ago

Similar issue here, also related to #41. I have JLab + JHub running for a set of users: the Hub and the Lab interface live in a lightweight env that users never interact with directly. Instead there is a shared conda env, and each user can also create their own.

It seems the extension can currently only start a cluster in the same environment as the Jupyter server; is that correct? It would be great if the "new" button followed the same logic as JLab's new-notebook launcher: show all the available kernels and let you start a cluster in whichever one you choose.
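For what it's worth, my understanding is that the kind of cluster the "new" button creates is controlled by a factory in the server-side dask config (the labextension section), but whatever factory you pick still runs in the server's environment. A rough sketch of that configuration expressed through dask.config follows; the key names are my reading of the extension's config layout, so treat them as assumptions, and in practice they would live in a dask YAML config file read by the JupyterLab server process:

```python
# Sketch of the server-side configuration the extension appears to read.
# The key names (labextension.factory.*) are assumptions based on the docs;
# setting them from a notebook kernel has no effect, because the extension
# runs in the Jupyter server process, not in the kernel.
import dask

dask.config.set({
    "labextension.factory.module": "dask.distributed",
    "labextension.factory.class": "LocalCluster",
    "labextension.factory.args": [],
    "labextension.factory.kwargs": {},
})

print(dask.config.get("labextension.factory"))
```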