Closed hakasapl closed 4 years ago
Hey @hakasapl
Thank you for reaching out. This package depends on nb_conda_kernels, which handles that.
Could you try the following command in the base environment to see if it lists the user conda environment?
python -m nb_conda_kernels list
If it does not, you may want to try the enabling command:
python -m nb_conda_kernels.install --enable
I do see my environments there:
$ python3 -m nb_conda_kernels list
[ListKernelSpecs] [nb_conda_kernels] enabled, 1 kernels found
Available kernels:
conda-env-.conda-Test-py /home/<user>/.conda/envs/test/share/jupyter/kernels/python3
python3 /modules/apps/python/3.8.5-jhub/share/jupyter/kernels/python3
But still nothing in the launcher.
Maybe the kernels are in the wrong spot? I would usually install user-side kernels in ~/.local/share/jupyter/kernels, not ~/.conda/envs.
JupyterLab does not run in a conda environment, it's a traditional install from PyPI.
> JupyterLab does not run in a conda environment, it's a traditional install from PyPI.
Ok, this is the reason it is not working out of the box. nb_conda_kernels works by overriding the JupyterLab server's kernel spec manager:
NotebookApp.kernel_spec_manager_class = "nb_conda_kernels.CondaKernelSpecManager"
So nb_conda_kernels needs to be importable within the environment that runs the JupyterLab server. The easiest fix is to install JupyterLab and nb_conda_kernels together in the base conda environment.
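For reference, that override is a single line in the server config file. This is a sketch assuming a classic jupyter_notebook_config.py; newer jupyter-server deployments would set the equivalent c.ServerApp trait instead:

```python
# jupyter_notebook_config.py (sketch)
# Assumes nb_conda_kernels is importable by the same Python that runs
# the JupyterLab server; `c` is the config object Jupyter injects.
c.NotebookApp.kernel_spec_manager_class = "nb_conda_kernels.CondaKernelSpecManager"
```

Installing nb_conda_kernels with conda normally registers this for you, which is why having JupyterLab outside any conda environment breaks the out-of-the-box behaviour.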
I had installed nb_conda_kernels in the same Python installation as JupyterLab, so it should be discoverable, right? Is there any way to change where nb_conda_kernels saves the kernelspecs?
Let me set this up in a conda environment instead, and I'll get back to you.
That worked! Not sure if this is a bug or not, but if it is I can start another issue (probably more of a nb_conda_kernels issue): When I create an environment the kernel shows up right away in the launcher, but it doesn't go away when I delete an environment until after I restart my jupyterlab session. Really minor, but let me know if you want me to spin up another issue for that. I'm closing this for now, thanks!
> When I create an environment the kernel shows up right away in the launcher, but it doesn't go away when I delete an environment until after I restart my jupyterlab session.
There are several pieces of code at work here. First, nb_conda_kernels has a cache time so it does not refresh too often. Then JupyterLab itself requests the kernel specs periodically; you can track the network requests to the kernelspecs endpoint (if I remember correctly, the polling interval for the latter is longer than the cache time). Finally, the launcher needs to be notified so it can update. So the whole process does take some time, but it should work. Did you try to open a fresh launcher after 2 or 3 minutes?
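The caching behaviour described above can be pictured with a minimal time-to-live cache. This is an illustrative stdlib-only sketch, not nb_conda_kernels' actual code; the class name and the 120-second TTL are made up for the example:

```python
import time


class TTLCachedSpecs:
    """Return cached kernel specs until a time-to-live expires (sketch)."""

    def __init__(self, fetch, ttl_seconds=120.0):
        self._fetch = fetch            # callable that lists kernel specs
        self._ttl = ttl_seconds        # how long a result is considered fresh
        self._cached = None
        self._stamp = -float("inf")    # force a fetch on first access

    def get(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self._stamp >= self._ttl:  # cache expired: refresh the list
            self._cached = self._fetch()
            self._stamp = now
        return self._cached
```

Until a cache like this expires and JupyterLab's own polling fires, a deleted environment's kernel can linger in the launcher, which matches the delay observed here.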
I don't think I waited that long, maybe a minute at the most, but that makes sense. Thanks for letting me know!
Description
Is it possible to have the jupyterlab extension automatically create a kernelspec for the user when a new environment is created? I'm working on deploying this on an HPC cluster, and I'd like to keep it as user-friendly as possible.
I can add global kernelspecs for everyone at the Python install location, and I can also add kernelspecs in my ~/.local/share/jupyter folder that show up for just one user. For a local conda environment, I usually add a kernelspec from within the environment like this:
python -m ipykernel install --user --name testName --display-name "Display Name Within JupyterLab"
Is it possible to have jupyter_conda run something like that when an environment is created? Currently I see the created environments with conda info --envs, but no kernel within JupyterLab.
Proposed Steps
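For reference, the per-environment registration done by the ipykernel install step above boils down to writing a small kernel.json file into a kernels directory. This stdlib-only sketch shows roughly what gets written; the function name and paths are illustrative, not ipykernel's actual implementation:

```python
import json
import os


def write_kernelspec(kernels_dir, name, display_name, python_path):
    """Write a minimal kernel.json, roughly what `python -m ipykernel install`
    produces (sketch)."""
    spec_dir = os.path.join(kernels_dir, name)
    os.makedirs(spec_dir, exist_ok=True)
    spec = {
        # argv is the command Jupyter runs to start the kernel
        "argv": [python_path, "-m", "ipykernel_launcher",
                 "-f", "{connection_file}"],
        "display_name": display_name,
        "language": "python",
    }
    with open(os.path.join(spec_dir, "kernel.json"), "w") as f:
        json.dump(spec, f, indent=1)
    return spec_dir
```

Because argv pins the environment's own Python interpreter, a spec like this is all JupyterLab needs to show the environment in the launcher, which is exactly what nb_conda_kernels synthesizes on the fly instead of writing to disk.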
Context
JupyterLab instances are launched from JupyterHub, all of which are in a centralized location. General users don't have write access to the install location. So when a user creates a conda environment, it rightfully goes into their home directory: