Closed - lhoupert closed this issue 3 years ago
Hi, so it worked for jupyter-notebook 6.1.6? I have never used jupyter-lab yet, so I'm not sure what I might need to do to make this compatible. I can have a look into it when I get a moment and will let you know how it goes.
Hi! Yes it works with jupyter-notebook 6.1.6.
Thanks!
Someone contributed some improved code a couple of months ago; I only just merged it now after your question reminded me. You could give it a go by updating from PyPI. Let me know if it works :)
Thanks! I just updated the package to version 2021.3.1, but I didn't manage to make it work with jupyter-lab 3.0.1. However, someone seems to have made it work with the previous version of jupyter-lab: https://stackoverflow.com/a/65907473/13890678
I just tried with jupyter-lab 2.2.6 and it works, so it must be due to a change in jupyter-lab version 3+.
It should be working with JupyterLab v3 now (thanks to @flying-sheep); I have updated it on PyPI. Once you have checked, could you please post back with the result and close the issue if all is OK?
Thanks @msm1089 and @flying-sheep , it works! I love the open source community!
On my system, JupyterLab 3.3.4 on Windows, it takes a whopping HALF A MINUTE to run ipynbname.name()!!! Yes, no typo: 30 seconds!
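(For anyone wanting to reproduce the measurement, a quick timing cell like the one below is enough; ipynbname.name() is the call in question.)

```python
import time
import ipynbname

t0 = time.perf_counter()
nb_name = ipynbname.name()
print(nb_name, f"-- lookup took {time.perf_counter() - t0:.1f} s")   # ~30 s on the setup described above
```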
Upon forking ipynbname and experimenting with it, I found that the fault lies with the lines:
```python
for file_name in chain(
    runtime_dir.glob('nbserver-*.json'),  # jupyter notebook (or lab 2)
    runtime_dir.glob('jpserver-*.json'),  # jupyterlab 3
):
```
The part for "jupyter notebook (or lab 2)" is a dead weight that adds a LOT (dozens and dozens) of entries to the results of _list_maybe_running_servers()
Those are all flagged as "stale" entries in the function _find_nb_path()
. To examine each of them, it takes a lots of calls to _get_sessions()
, which eats up that ridiculous amount of time!
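For context, here is a rough sketch of what that lookup amounts to (paraphrased from the package's __init__.py linked a few lines down, not a verbatim copy; the helper names and JSON keys here are simplified assumptions):

```python
# Rough, simplified sketch of the lookup path described above -- not the package's exact code.
# Every stale nbserver-*.json file left in the runtime dir becomes one more server to query.
import json
import urllib.request
from itertools import chain
from pathlib import Path

from jupyter_core.paths import jupyter_runtime_dir


def list_maybe_running_servers():
    """Yield the config of every server that left a JSON file in the runtime dir."""
    runtime_dir = Path(jupyter_runtime_dir())
    for file_name in chain(
        runtime_dir.glob('nbserver-*.json'),   # jupyter notebook (or lab 2) -- dozens of stale entries
        runtime_dir.glob('jpserver-*.json'),   # jupyterlab 3
    ):
        yield json.loads(file_name.read_text())


def get_sessions(server):
    """Ask one (possibly long-dead) server for its sessions; these calls are what eat the time."""
    url = f"{server['url']}api/sessions?token={server.get('token', '')}"
    with urllib.request.urlopen(url, timeout=2) as resp:   # timeout added here just for illustration
        return json.load(resp)
```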
By simply ditching the line that is useless to users of jupyterlab 3:

```python
runtime_dir.glob('nbserver-*.json'),  # jupyter notebook (or lab 2)
```

the problem goes away, and the execution time becomes almost instant!
The original repo file for ipynbname is: https://github.com/msm1089/ipynbname/blob/master/ipynbname/__init__.py
A strategy to fix that would be to simply switch the two lines so the lab 3 ones are tried first.
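i.e. something like this (an untested sketch of the swap, nothing more):

```python
from itertools import chain
from pathlib import Path

from jupyter_core.paths import jupyter_runtime_dir

runtime_dir = Path(jupyter_runtime_dir())
for file_name in chain(
    runtime_dir.glob('jpserver-*.json'),   # jupyterlab 3 -- checked first
    runtime_dir.glob('nbserver-*.json'),   # jupyter notebook (or lab 2)
):
    print(file_name)   # in the real code, each file feeds the server/session lookup
```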
Another strategy is implemented in #11.
Also, is there a way to clean up stale servers, other than manually deleting those files?
A complication, since my earlier post! :/
I ran my Jupyterlab notebooks on Binder (mybinder.org)... I confirmed it's version 3:
```
!jupyter --version

Selected Jupyter core packages...
IPython          : 7.32.0
ipykernel        : 6.12.1
ipywidgets       : 7.7.0
jupyter_client   : 7.2.2
jupyter_core     : 4.9.2
jupyter_server   : 1.16.0
jupyterlab       : 3.3.4
nbclient         : 0.5.13
nbconvert        : 6.4.5
nbformat         : 5.3.0
notebook         : 6.4.10
qtconsole        : not installed
traitlets        : 5.1.1
```
and it does NOT work!
Diagnostic print statements I inserted in my forked version of ipynbname show:

```
runtime_dir: /home/jovyan/.local/share/jupyter/runtime
```
and see what happens when I look into that directory (from within the notebook):
```
%ls -l /home/jovyan/.local/share/jupyter/runtime

total 16
-rw------- 1 jovyan jovyan 263 Jun 13 22:20 kernel-2c3fb78e-11bd-4524-a7a3-39dcdac3c940.json
-rw-r--r-- 1 jovyan jovyan 299 Jun 13 21:53 nbserver-25.json
-rw-r--r-- 1 jovyan jovyan 688 Jun 13 21:53 nbserver-25-open.html
-rw------- 1 jovyan jovyan  45 Jun 13 21:53 notebook_cookie_secret
```
Instead of the expected jpserver-*.json files, we have nbserver-*.json files!! An inconsistency in the file naming :o Maybe a Windows/Linux issue??
I ended up creating a version of ipynbname, which I named ipynbname3, that first tries the "jpserver" names and, in case of no hits, falls back to the "nbserver" names. It runs fast both on my local Windows machine and on Binder.
The code is distributed with the "Life123" open source project. The direct link is: https://github.com/BrainAnnex/life123/blob/main/experiments/ipynbname3.py (in case it later gets moved, look for it under the project's repo: https://github.com/BrainAnnex/life123 )
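The gist of it is roughly this (a simplified sketch for illustration only; the real ipynbname3.py at the link above does more, and candidate_servers is just a name made up for this sketch):

```python
# Simplified illustration of the "jpserver first, nbserver only as a fallback" idea --
# not the actual ipynbname3.py code.
import json
from pathlib import Path

from jupyter_core.paths import jupyter_runtime_dir


def candidate_servers():
    runtime_dir = Path(jupyter_runtime_dir())
    servers = [json.loads(f.read_text()) for f in runtime_dir.glob('jpserver-*.json')]   # jupyterlab 3
    if not servers:   # only fall back to the (often stale) notebook / lab-2 files if nothing was found
        servers = [json.loads(f.read_text()) for f in runtime_dir.glob('nbserver-*.json')]
    return servers
```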
I urge the maintainers of Jupyter Lab to:
@flying-sheep - I had tried your suggested approach of just reversing those 2 lines. It's OK, but it still takes a few seconds to run on Windows... not sure why. The new version in my last post, by contrast, seems to run instantly both on my local Windows machine and on Binder.
You made a great point about "is there a way to clean stale servers? Other than manually deleting those files?" That's a root cause of why the original code runs so slowly on my Windows machine! Somehow, I have accumulated stale files from YEARS back!
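In the meantime, something like this could prune them by hand (just a hedged sketch, not an official Jupyter command: it assumes each server JSON file records the server's pid, it needs the third-party psutil package, and it deletes files, so use with care):

```python
import json
from itertools import chain
from pathlib import Path

import psutil   # pip install psutil
from jupyter_core.paths import jupyter_runtime_dir

runtime_dir = Path(jupyter_runtime_dir())
for f in chain(runtime_dir.glob('nbserver-*.json'), runtime_dir.glob('jpserver-*.json')):
    try:
        pid = json.loads(f.read_text()).get('pid')
    except (OSError, json.JSONDecodeError):
        continue
    if pid is None or not psutil.pid_exists(pid):   # server process is gone -> file is stale
        print(f"removing stale {f.name}")
        f.unlink()
```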
I note that the thread you posted a link to, https://github.com/msm1089/ipynbname/pull/11, was discussing prioritizing recent servers...
A 3rd suggestion to the maintainers of Jupyter Lab is:
Yeah, I think jupyter lab should allow you to do this.
Hi, I was wondering if there are plans to make this work in jupyter-lab? It would be great. I didn't find any "easy" solution for jupyter-lab, so I opened a Stack Overflow post about it [here]; that is where I discovered your package.