Open · jacobtomlinson opened this issue 4 years ago
We do something like this today with the start_ipython_workers method on the client. You might want to take a look there.
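For reference, a rough sketch of how that looks with the client API (the exact signature and return shape may differ across dask.distributed versions):

```python
# Minimal sketch of Client.start_ipython_workers in dask.distributed;
# return shape may vary between versions.
from dask.distributed import Client

client = Client()  # connect to a cluster (local, for illustration)

# Launch an IPython kernel on each worker. The return value maps each
# worker's address to that kernel's connection info (ports, key, etc.),
# which a Jupyter frontend could in principle attach to.
info = client.start_ipython_workers()
for worker, connection in info.items():
    print(worker, connection["shell_port"])
```

There is also a magic_names= option that registers IPython magics for running code on a worker from the local session, if memory serves.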
I've had a thought that I'd like some feedback on before I investigate further.
Would it be possible to start a notebook kernel on a Dask worker and connect to it from Jupyter Lab, using an approach like this one (https://github.com/ipython/ipython/wiki/Cookbook:-Connecting-to-a-remote-kernel-via-ssh), but not necessarily using SSH to proxy the connection?
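For context, that cookbook recipe amounts to copying the kernel's connection file to the client machine and pointing a Jupyter frontend at it; something similar could presumably ride on whatever transport Dask provides. A rough sketch with jupyter_client (the file name is hypothetical, and the kernel's ports are assumed reachable, e.g. via a tunnel):

```python
# Sketch: attach to an already-running remote kernel whose connection
# file has been copied locally ("kernel-remote.json" is a hypothetical
# path) and whose ports are reachable from this machine.
from jupyter_client import BlockingKernelClient

kc = BlockingKernelClient(connection_file="kernel-remote.json")
kc.load_connection_file()
kc.start_channels()

# Execute a statement on the remote kernel and check that it succeeded.
kc.execute("import socket; print(socket.gethostname())")
reply = kc.get_shell_msg(timeout=10)
print(reply["content"]["status"])  # "ok" on success
```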
Ensuring environments are consistent between client and workers can often be frustrating, especially when a user has Jupyter Lab in a local conda environment on their laptop but is launching a cluster somewhere like Kubernetes, where the workers will be using a Docker image.
Running a kernel on a worker may be one workaround/solution to this.
I would imagine the workflow to be something like:
@ian-r-rose @mrocklin