Closed · robknapen closed this issue 10 months ago
[robknapen]
In the FAIRiCUBE Hub JupyterLab there is a tab for Dask, but when I try to import the Dask packages in the fairicubeuc2-torch kernel they are not available. Can they please be added (this perhaps also pertains to other kernels)?

The same applies to the 'retrying' and 'pyspark' packages (the latter also requiring Apache Spark to be installed).

Even though the server is a single node, Dask can help with writing and experimenting with distributed code, and perhaps make better use of the available CPU cores on the node.
[Stefan Achtsnit] 1 hour ago
You need to install Dask in the conda environment of the kernel. What you see on the left is just the JupyterLab extension that opens the Dask dashboard.
[robknapen]
Ok, thanks for the reply.
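For anyone landing here with the same problem, a minimal sketch of the suggested setup: assuming dask and its distributed scheduler have been installed into the conda environment behind the kernel (for example with `conda install dask distributed`), a `LocalCluster` can spread work over the node's CPU cores, and the JupyterLab extension can then open its dashboard. The worker counts and array sizes below are illustrative assumptions, not values from the thread.

```python
# Minimal sketch: Dask on a single node, assuming 'dask' and 'distributed'
# are installed in the conda environment that backs the Jupyter kernel.
from dask.distributed import Client, LocalCluster
import dask.array as da

# Start a local cluster on this node; with no arguments it would use all
# available cores. These worker settings are illustrative assumptions.
cluster = LocalCluster(n_workers=4, threads_per_worker=1)
client = Client(cluster)
print(client.dashboard_link)  # URL the JupyterLab Dask extension can open

# A small computation that runs in parallel across the local workers.
x = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))
print(x.mean().compute())

client.close()
cluster.close()
```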