neurolibre / neurolibre-reviews

Where NeuroLibre reviews live.
https://neurolibre.org

[PRE REVIEW]: Quantitative T1 MRI #2

Closed roboneuro closed 1 year ago

roboneuro commented 3 years ago

Submitting author: @mathieuboudreau (Mathieu Boudreau)
Repository: https://github.com/qMRLab/t1-book-neurolibre
Branch with paper.md (empty if default branch): main
Version: v1.0.0
Editor: @agahkarakuzu
Reviewers: @agahkarakuzu
Reproducible preprint: Pending
Repository archive: Pending
Data archive: Pending
Book archive: Pending
Docker archive: Pending

Author instructions

Thanks for submitting your paper to NeuroLibre @mathieuboudreau.

@mathieuboudreau if you have any suggestions for potential reviewers, please mention them here in this thread (without tagging them with an @). In addition, this list of people has already agreed to review for NeuroLibre and may be suitable for this submission (please start at the bottom of the list).

Editor instructions

The NeuroLibre submission bot @roboneuro is here to help you find and assign reviewers and start the main review. To find out what @roboneuro can do for you type:

@roboneuro commands
mathieuboudreau commented 3 years ago

Ok. Well, as of next Monday I am working two days a week for another lab, so I will have to try to debug this issue whenever I can find time between that and my other tasks and meetings on the remaining three days. I had hoped that submitting with pre-built notebooks would speed up the submission process (since the Jupyter Book build itself is now broken) and that we were nearly there, but it looks like this will take some more time. I'll update you when I've made any progress debugging this – I probably won't have more than a couple of hours next week to start digging into it.

mathieuboudreau commented 3 years ago

If @agahkarakuzu has more time available, maybe he can explore this sooner than I can

ltetrel commented 3 years ago

No problem @mathieuboudreau, I will also look at it. Specifically, I will compare the built docker image (in the registry) with a local version of yours.

ltetrel commented 3 years ago

@mathieuboudreau @agahkarakuzu Alright, so I have tried to run `jupyter-book build content/` in several environments:

  1. After pulling the image from our docker registry binder-registry.conp.cloud/binder-registry.conp.cloud/binder-qmrlab-2dt1-5fbook-54ed4d:330da581e4154314d693c61648bdc7cfcf3a676a
  2. After building the image locally:
    git clone -b neurolibre https://github.com/qMRLab/t1_book
    cd binder
    sudo docker build --no-cache --tag=qmrlab-2dt1-5fbook-54ed4d:330da581e4154314d693c61648bdc7cfcf3a676a .
  3. Directly on Mybinder environment (using the terminal): https://mybinder.org/v2/gh/qMRLab/t1_book/neurolibre

All of these methods returned the same error mentioned above. I would suggest comparing the local environment (where, presumably, your notebooks execute correctly on your PC) with the environment inside the Docker image (you can pull binder-registry.conp.cloud/binder-registry.conp.cloud/binder-qmrlab-2dt1-5fbook-54ed4d:330da581e4154314d693c61648bdc7cfcf3a676a).
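One quick way to act on this suggestion is to dump `pip freeze` in both places and diff the pins. A minimal sketch (stdlib only; the two freeze snippets in `__main__` are illustrative, not taken from the actual environments):

```python
# Sketch: compare two `pip freeze` dumps to spot version drift between
# a local environment and the Docker image.

def parse_freeze(text):
    """Map package name -> pinned version from `pip freeze` output."""
    pins = {}
    for line in text.splitlines():
        if "==" in line:
            name, version = line.split("==", 1)
            pins[name.strip().lower()] = version.strip()
    return pins

def diff_freeze(local_text, docker_text):
    """Return {package: (local_version, docker_version)} for mismatches."""
    local, docker = parse_freeze(local_text), parse_freeze(docker_text)
    return {
        pkg: (local.get(pkg), docker.get(pkg))
        for pkg in sorted(set(local) | set(docker))
        if local.get(pkg) != docker.get(pkg)
    }

if __name__ == "__main__":
    local = "jupyter-client==6.1.12\nnbclient==0.5.4\n"
    docker = "jupyter-client==5.2.3\nnbclient==0.5.4\n"
    print(diff_freeze(local, docker))  # → {'jupyter-client': ('6.1.12', '5.2.3')}
```

In practice you would feed it the output of `pip freeze` run locally and inside the pulled container.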

mathieuboudreau commented 3 years ago

Here is a 10 minute screen-captured demo of my IR notebooks all executing correctly on MyBinder with my current setup on the neurolibre branch: https://youtu.be/1oCLPyCht-Y

I conclude two things from this:

1 - My Dockerfile is well-designed to execute my notebooks from within a built container image.
2 - My notebooks execute correctly and generate the Plotly figures.

Also, because the Jupyter Book built correctly before with the notebooks pre-executed,

3 - My Jupyter Book config file is well-designed to build the book using our previous method.

The problem then must be related to Jupyter Book/RoboNeuro/NeuroLibre.

My repo follows the guidelines as described in the NeuroLibre submission documentation.

Here is a direct link to my image on MyBinder so you can test for yourself: https://mybinder.org/v2/gh/qMRLab/t1_book/neurolibre?urlpath=lab

mathieuboudreau commented 3 years ago

I am still open to reverting to our previous method, where the Jupyter Book was built using pre-executed notebooks. But that's not my call, as I am the author in this case and not one of the testers/editors.

emdupre commented 3 years ago

Can you share the results from pip freeze in the Dockerized execution environment, @mathieuboudreau ? I'm particularly interested in your nbclient version.

mathieuboudreau commented 3 years ago

Here's the output @emdupre:

```shell
jovyan@jupyter-qmrlab-2dt1-5fbook-2dob1c3tui:~/work/t1_book$ pip freeze
alabaster==0.7.12
alembic==0.9.9
annexremote==1.5.0
anyio==3.3.2
appdirs==1.4.3
argon2-cffi==21.1.0
asn1crypto==0.24.0
async-generator==1.10
attrs==21.2.0
Automat==0.0.0
awscli==1.20.52
Babel==2.9.1
backcall==0.1.0
beautifulsoup4==4.10.0
bleach==2.1.4
boto==2.49.0
botocore==1.21.52
certifi==2021.5.30
cffi==1.11.5
chardet==3.0.4
click==8.0.1
colorama==0.4.3
conda==4.5.8
constantly==15.1.0
contextvars==2.4
cryptography==2.2.1
cycler==0.10.0
dataclasses==0.8
datalad==0.15.1
decorator==4.3.0
defusedxml==0.7.1
Deprecated==1.2.13
docutils==0.17.1
entrypoints==0.2.3
fasteners==0.16.3
feather-format==0.4.1
filelock==3.2.0
Flask==2.0.1
future==0.18.2
gdown==3.14.0
gitdb==4.0.7
GitPython==3.1.18
greenlet==1.1.2
html5lib==1.0.1
humanize==3.11.0
hyperlink==17.3.1
idna==2.7
imagesize==1.2.0
immutables==0.16
importlib-metadata==4.8.1
importlib-resources==3.3.1
incremental==17.5.0
ipykernel==4.8.2
ipython==6.5.0
ipython-genutils==0.2.0
ipywidgets==7.6.5
iso8601==0.1.16
itsdangerous==2.0.1
jedi==0.12.1
jeepney==0.7.1
Jinja2==2.10
jmespath==0.10.0
json5==0.9.6
jsonschema==2.6.0
jupyter-book==0.10.0
jupyter-cache==0.4.3
jupyter-client==5.2.3
jupyter-core==4.4.0
jupyter-server==1.11.0
jupyter-server-mathjax==0.2.3
jupyter-sphinx==0.3.1
jupyterhub==0.9.2
jupyterlab==2.2.0
jupyterlab-launcher==0.13.1
jupyterlab-server==1.2.0
jupyterlab-widgets==1.0.2
jupytext==1.13.0
keyring==23.2.1
keyrings.alt==4.1.0
kiwisolver==1.3.1
latexcodec==2.0.1
linkify-it-py==1.0.1
Mako==1.0.7
Markdown==3.3.4
markdown-it-py==1.1.0
MarkupSafe==1.0
matplotlib==3.3.4
mdit-py-plugins==0.2.8
metakernel==0.27.5
mistune==0.8.3
msgpack==1.0.2
myst-nb==0.11.1
myst-parser==0.13.7
nbclient==0.5.4
nbconvert==5.4.0
nbdime==3.1.0
nbformat==4.4.0
nest-asyncio==1.5.1
nested-lookup==0.2.22
networkx==2.5.1
notebook==5.6.0
numpy==1.19.5
octave-kernel==0.32.0
osfclient==0.0.5
packaging==21.0
pamela==0.3.0
pandas==1.1.5
pandocfilters==1.4.2
parso==0.3.1
patool==1.12
pexpect==4.6.0
pickleshare==0.7.4
Pillow==8.3.2
plotly==3.10.0
prometheus-client==0.3.0
prompt-toolkit==1.0.15
psutil==5.8.0
ptyprocess==0.6.0
pyarrow==5.0.0
pyasn1==0.4.4
pyasn1-modules==0.2.1
pybtex==0.24.0
pybtex-docutils==1.0.1
pycosat==0.6.3
pycparser==2.18
pycurl==7.43.0.2
pydata-sphinx-theme==0.6.3
pydot==1.4.2
pydotplus==2.0.2
PyGithub==1.55
Pygments==2.2.0
PyJWT==2.1.0
PyNaCl==1.4.0
pyOpenSSL==18.0.0
pyparsing==2.4.7
PySocks==1.6.8
python-dateutil==2.7.3
python-editor==1.0.3
python-gitlab==2.10.1
python-oauth2==1.0.1
pytz==2021.1
PyYAML==5.4.1
pyzmq==17.1.2
repo2data==2.4.4
requests==2.19.1
requests-toolbelt==0.9.1
requests-unixsocket==0.2.0
retrying==1.3.3
rsa==4.7.2
ruamel-yaml==0.15.44
s3transfer==0.5.0
sas7bdat==2.2.3
saspy==3.7.5
scipy==1.5.4
SecretStorage==3.3.1
Send2Trash==1.5.0
service-identity==17.0.0
simplegeneric==0.8.1
simplejson==3.17.5
six==1.11.0
smmap==4.0.0
sniffio==1.2.0
snowballstemmer==2.1.0
sos==0.17.7
sos-bash==0.12.3
sos-javascript==0.9.12.2
sos-julia==0.9.12.1
sos-matlab==0.9.12.1
sos-notebook==0.17.2
sos-python==0.9.12.1
sos-r==0.9.12.2
sos-ruby==0.9.15.0
sos-sas==0.9.12.3
soupsieve==2.2.1
Sphinx==3.5.4
sphinx-book-theme==0.1.5
sphinx-comments==0.0.3
sphinx-copybutton==0.4.0
sphinx-panels==0.5.2
sphinx-thebe==0.0.10
sphinx-togglebutton==0.2.3
sphinxcontrib-applehelp==1.0.2
sphinxcontrib-bibtex==2.1.4
sphinxcontrib-devhelp==1.0.2
sphinxcontrib-htmlhelp==2.0.0
sphinxcontrib-jsmath==1.0.1
sphinxcontrib-qthelp==1.0.3
sphinxcontrib-serializinghtml==1.1.5
SQLAlchemy==1.4.25
tabulate==0.8.9
terminado==0.8.1
testpath==0.3.1
toml==0.10.2
tornado==5.1
tqdm==4.62.3
traitlets==4.3.2
Twisted==18.7.0
typing-extensions==3.10.0.2
uc-micro-py==1.0.1
urllib3==1.23
Wand==0.6.7
wcwidth==0.1.7
webencodings==0.5
websocket-client==1.2.1
Werkzeug==2.0.1
wget==3.2
Whoosh==2.7.4
widgetsnbextension==3.5.1
wrapt==1.12.1
xxhash==2.0.2
zipp==3.6.0
zope.interface==4.5.0
jovyan@jupyter-qmrlab-2dt1-5fbook-2dob1c3tui:~/work/t1_book$
```

emdupre commented 3 years ago

Thanks, @mathieuboudreau! That's quite an old version of jupyter-client, and this error should have been resolved as of 6.1.3. I'm wondering if this is occurring because of the interaction between the pinned vs. unpinned dependencies and the chosen pip solver. The docker logs suggest as much:

```
ERROR: pip's legacy dependency resolver does not consider dependency conflicts when selecting packages. This behaviour is the source of the following dependency conflicts.
twisted 18.7.0 requires PyHamcrest>=1.9.0, which is not installed.
twisted 18.7.0 requires Automat>=0.3.0, but you'll have automat 0.0.0 which is incompatible.
flask 2.0.1 requires Jinja2>=3.0, but you'll have jinja2 2.10 which is incompatible.
jupyterlab-server 1.2.0 requires jsonschema>=3.0.1, but you'll have jsonschema 2.6.0 which is incompatible.
sphinx 3.5.4 requires docutils<0.17,>=0.12, but you'll have docutils 0.17.1 which is incompatible.
pydata-sphinx-theme 0.6.3 requires docutils<0.17, but you'll have docutils 0.17.1 which is incompatible.
sphinx-book-theme 0.1.5 requires click~=7.1, but you'll have click 8.0.1 which is incompatible.
sphinx-book-theme 0.1.5 requires docutils<0.17,>=0.15, but you'll have docutils 0.17.1 which is incompatible.
sphinx-panels 0.5.2 requires importlib-resources~=3.0.0; python_version < "3.7", but you'll have importlib-resources 3.3.1 which is incompatible.
anyio 3.3.2 requires idna>=2.8, but you'll have idna 2.7 which is incompatible.
jupyter-server 1.11.0 requires jupyter-client>=6.1.1, but you'll have jupyter-client 5.2.3 which is incompatible.
jupyter-server 1.11.0 requires jupyter-core>=4.6.0, but you'll have jupyter-core 4.4.0 which is incompatible.
jupyter-server 1.11.0 requires terminado>=0.8.3, but you'll have terminado 0.8.1 which is incompatible.
jupyter-server 1.11.0 requires tornado>=6.1.0, but you'll have tornado 5.1 which is incompatible.
nbclient 0.5.4 requires jupyter-client>=6.1.5, but you'll have jupyter-client 5.2.3 which is incompatible.
nbclient 0.5.4 requires nbformat>=5.0, but you'll have nbformat 4.4.0 which is incompatible.
myst-parser 0.13.7 requires markdown-it-py~=0.6.2, but you'll have markdown-it-py 1.1.0 which is incompatible.
jupyter-sphinx 0.3.1 requires nbconvert>=5.5, but you'll have nbconvert 5.4.0 which is incompatible.
myst-nb 0.11.1 requires nbconvert~=5.6, but you'll have nbconvert 5.4.0 which is incompatible.
myst-nb 0.11.1 requires nbformat~=5.0, but you'll have nbformat 4.4.0 which is incompatible.
jupyter-book 0.10.0 requires jupytext~=1.8.0, but you'll have jupytext 1.13.0 which is incompatible.
botocore 1.21.53 requires urllib3<1.27,>=1.25.4, but you'll have urllib3 1.23 which is incompatible.
awscli 1.20.53 requires docutils<0.16,>=0.10, but you'll have docutils 0.17.1 which is incompatible.
python-gitlab 2.10.1 requires requests>=2.25.0, but you'll have requests 2.19.1 which is incompatible.
```
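These warnings all have the same shape: a package declares a minimum version that the pinned install doesn't satisfy. A toy check of the jupyter-client case, assuming plain numeric dotted versions (real resolvers apply the full PEP 440 rules, including pre-releases and epochs):

```python
# Sketch: compare dotted version numbers the way the resolver warnings above
# describe, e.g. nbclient 0.5.4 requires jupyter-client>=6.1.5.

def version_tuple(v):
    """'6.1.5' -> (6, 1, 5); numeric components only."""
    return tuple(int(part) for part in v.split("."))

def satisfies_min(installed, minimum):
    """True if installed >= minimum (tuple comparison on release numbers)."""
    return version_tuple(installed) >= version_tuple(minimum)

# jupyter-client 5.2.3 vs the >=6.1.5 floor that nbclient 0.5.4 declares:
print(satisfies_min("5.2.3", "6.1.5"))   # → False
print(satisfies_min("6.1.12", "6.1.5"))  # → True
```

Tuple comparison matters here: string comparison would wrongly rank "6.1.12" below "6.1.5".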

I'm currently not well set up to do extensive testing, but I imagine the problem is that your current requirements conflict with something Jupyter Book needs in its build process (likely jupyter-client or nbclient). This may not affect your notebooks themselves if you don't use that dependency in your execution.

As a first pass, could we update the pip solver? Or, if that's not possible, could we unpin the versions for nbconvert and jupyter-lab and let pip solve to its preferred version?
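One way to "update the pip solver" is simply to upgrade pip itself before installing requirements, since pip ≥ 20.3 enables the backtracking dependency resolver by default. A hypothetical Dockerfile fragment (the file location and requirements path are assumptions about the repo's layout, not its actual contents):

```dockerfile
# Hypothetical fragment for binder/Dockerfile: upgrade pip first so the
# new (>=20.3) resolver flags dependency conflicts instead of ignoring them.
RUN python -m pip install --upgrade "pip>=20.3" && \
    python -m pip install -r requirements.txt
```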

mathieuboudreau commented 3 years ago

My main concern is that, if I remember correctly from two years ago, SoS (or at least the version I'm using) was only made compatible with a specific range of JupyterLab versions, so the SoS kernel might not work if I upgrade it. We can certainly try to unpin these, but it might bring on other issues.

Maybe a separate branch to test these on would be appropriate?

mathieuboudreau commented 3 years ago

We could also definitely update the pip solver.

mathieuboudreau commented 3 years ago

Ok @emdupre, I made your requested changes in the neurolibre-unpin branch.

mathieuboudreau commented 3 years ago

@roboneuro generate nl-notebook from branch neurolibre-unpin

roboneuro commented 3 years ago
Attempting NeuroLibre notebook compilation from custom branch neurolibre-unpin. Reticulating splines etc...
roboneuro commented 3 years ago

:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:

emdupre commented 3 years ago

OK, this failed! I think it was because of the compute time. I'll look into this more and try to provide a more graceful failure soon.

Logs:

```
2021-10-02T01:15:57.959471+00:00 app[worker.1]: 4 TID-gnz5rplrg INFO: Pushed 1 jobs back to Redis
2021-10-02T01:15:57.959916+00:00 app[worker.1]: 4 TID-gnz6qqiog JBWorker JID-dd80a11aca679e86c57b57be INFO: fail: 1376.01 sec
2021-10-02T01:15:57.961026+00:00 app[worker.1]: 4 TID-gnz5rplrg INFO: Bye!
2021-10-02T01:15:58.113896+00:00 heroku[worker.1]: Process exited with status 0
2021-10-02T01:15:58.323567+00:00 app[worker.1]: http://neurolibre-data.conp.cloud:8081/api/v1/resources/books?commit_hash=2f96ed333b93a3eb67976cc06964d5a6d79014c3
2021-10-02T01:15:58.631122+00:00 app[worker.1]: Requested resource does not exist.
2021-10-02T01:15:59.245337+00:00 app[worker.1]: hit 409
2021-10-02T01:15:59.245366+00:00 app[worker.1]: 2f96ed333b93a3eb67976cc06964d5a6d79014c3
2021-10-02T01:15:59.245389+00:00 app[worker.1]: http://neurolibre-data.conp.cloud:8081/api/v1/resources/books?commit_hash=2f96ed333b93a3eb67976cc06964d5a6d79014c3
2021-10-02T01:15:59.507150+00:00 app[worker.1]: Requested resource does not exist.
2021-10-02T01:15:59.507299+00:00 app[worker.1]: Returning the latest successful book build
2021-10-02T01:15:59.507441+00:00 app[worker.1]: http://neurolibre-data.conp.cloud:8081/api/v1/resources/books?repo_name=t1_book
2021-10-02T01:15:59.938672+00:00 app[worker.1]: 4 TID-gnqxaq9gg JBWorker JID-dd80a11aca679e86c57b57be INFO: fail: 1.981 sec
2021-10-02T01:15:59.938876+00:00 app[worker.1]: 4 TID-gnqxaq9gg WARN: {"context":"Job raised exception"}
2021-10-02T01:15:59.939523+00:00 app[worker.1]: 4 TID-gnqxaq9gg WARN: NoMethodError: undefined method `[]' for nil:NilClass
2021-10-02T01:15:59.939565+00:00 app[worker.1]: 4 TID-gnqxaq9gg WARN: /app/workers.rb:849:in `perform'
```

EDIT: and we're seeing the same failure on the preview page, just as confirmation that these two are now in sync!

Screenshot_20211001-221510.png

mathieuboudreau commented 3 years ago

MyBinder was powerful enough to build it: https://mybinder.org/v2/gh/qMRLab/t1_book/neurolibre-unpin?urlpath=lab

Screen Shot 2021-10-02 at 1 19 15 AM Screen Shot 2021-10-02 at 1 19 42 AM
mathieuboudreau commented 3 years ago

However, as I expected, the updated versions of JupyterLab and nbconvert rendered the SoS kernel incompatible; it's no longer showing up. This means the notebooks won't be able to execute with the current setup, even if we fix the slow/underpowered NeuroLibre BinderHub setup that's preventing the build from completing there.

Screen Shot 2021-10-02 at 1 21 12 AM
ltetrel commented 3 years ago

@mathieuboudreau The notebooks execute OK in the Binder environment, but here is what I get running jupyter-book inside the Binder terminal (MyBinder). Could this indeed be related to the Jupyter Book rendering itself?

I also tried to execute a notebook separately with jupyter nbconvert --to notebook --execute content/01/IR_BenefitsAndPitfalls.ipynb (instead of jupyter-book build content), and the error seems different:

ltetrel commented 3 years ago

@emdupre Not sure why the error page is empty, because the log file exists here: https://neurolibre-data.conp.cloud/book-artifacts/qMRLab/github.com/t1_book/baeab5445fe59a151c896c7359e44f7b45190c80/book-build.log. As we can see, it is still the same jupyter-book build issue as above.

emdupre commented 3 years ago

Yes, neither the preview page nor the roboneuro bot had a nice mechanism to display time-out errors. That's something that can be added!

ltetrel commented 3 years ago

Ok, it is not clear to me what @mathieuboudreau's issue was with the logging from RoboNeuro. Was it the empty output of the page? Or that you did not receive any mail from RoboNeuro? In any case, the empty output on the page is problematic and confusing for the end user (we already raised that issue a few weeks ago), but we can discuss it quickly tomorrow.

mathieuboudreau commented 3 years ago

Update

I've removed all version restrictions in my pip requirements (e.g. sos, plotly, jupyter-book, nbviewer, etc.). I've also updated the Docker image that we're pulling from a fixed version to latest.

I think I have three things to resolve now:

Screen Shot 2021-10-20 at 12 33 13 PM Screen Shot 2021-10-20 at 12 33 19 PM

One last bug I found happening occasionally (not all the time) is that the transfer of data/environment from Octave to Python sometimes fails, but only sporadically (rerunning works for some reason). Maybe the issue is that the cells are being run too quickly.
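If the sporadic Octave-to-Python transfer failure does turn out to be timing-related, one low-effort mitigation is a retry wrapper around the flaky step. A sketch, where `transfer` stands in for whatever call actually moves the data (the names, attempt count, and delay are hypothetical, not part of the actual notebooks):

```python
import time

def with_retries(transfer, attempts=3, delay=1.0):
    """Call `transfer()` up to `attempts` times, sleeping `delay` seconds
    between tries; re-raise the last error if every attempt fails."""
    last_err = None
    for _ in range(attempts):
        try:
            return transfer()
        except Exception as err:  # narrow this to the real failure type
            last_err = err
            time.sleep(delay)
    raise last_err
```

Catching bare `Exception` is deliberately broad for the sketch; in the notebooks you would catch the specific error the Octave-to-Python bridge raises.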

After all of this is done, I don't think I can make further improvements, as every dependency will be up to date and the simulations' run times have been drastically shortened. If RoboNeuro/NeuroLibre still doesn't function correctly during the submission of this notebook after all these changes, then either 1) changes will need to be made on the NeuroLibre/RoboNeuro side to accommodate our book's needs, or 2) we close the submission of this book.

I'll post again once the remaining to-do list is complete.

mathieuboudreau commented 3 years ago

p.s. the branch that I'm working on now is neurolibre-unleashed

mathieuboudreau commented 3 years ago

I've updated my TOC (058ae1a) and attempted a Jupyter Book build, which fails while executing the notebooks.

CLI log

``` (base) jovyan@jupyter-qmrlab-2dt1-5fbook-2di83ukryw:~/work/t1_book$ jupyter-book build content/ Running Jupyter-Book v0.12.0 Source Folder: /home/jovyan/work/t1_book/content Config Path: /home/jovyan/work/t1_book/content/_config.yml Output Path: /home/jovyan/work/t1_book/content/_build/html Running Sphinx v4.2.0 making output directory... done [etoc] Changing master_doc to 'intro' [etoc] Excluded 2 extra file(s) not in toc WARNING: multiple files found for the document "01/IR_References": ['01/IR_References.md', '01/IR_References.ipynb'] Use '/home/jovyan/work/t1_book/content/01/IR_References.md' for the build. myst v0.15.2: MdParserConfig(renderer='sphinx', commonmark_only=False, enable_extensions=['html_image', 'amsmath', 'colon_fence', 'deflist', 'dollarmath', 'html_admonition', 'linkify', 'replacements', 'smartquotes', 'substitution'], dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area', disable_syntax=[], url_schemes=['mailto', 'http', 'https'], heading_anchors=None, heading_slug_func=None, html_meta=[], footnote_transition=True, substitutions=[], sub_delimiters=['{', '}'], words_per_minute=200) building [mo]: targets for 0 po files that are out of date building [html]: targets for 12 source files that are out of date updating environment: [new config] 12 added, 0 changed, 0 removed Executing: 01/IR_BenefitsAndPitfalls in: /home/jovyan/work/t1_book/content/01 [MetaKernelApp] ERROR | Exception in message handler: Traceback (most recent call last): File "/opt/conda/lib/python3.9/site-packages/metakernel/replwrap.py", line 168, in _expect_prompt_stream pos = self.child.expect(expects, timeout=stream_timeout) File "/opt/conda/lib/python3.9/site-packages/pexpect/spawnbase.py", line 343, in expect return self.expect_list(compiled_pattern_list, File "/opt/conda/lib/python3.9/site-packages/pexpect/spawnbase.py", line 372, in 
expect_list return exp.expect_loop(timeout) File "/opt/conda/lib/python3.9/site-packages/pexpect/expect.py", line 181, in expect_loop return self.timeout(e) File "/opt/conda/lib/python3.9/site-packages/pexpect/expect.py", line 144, in timeout raise exc pexpect.exceptions.TIMEOUT: Timeout exceeded. command: /usr/bin/octave-cli args: [b'/usr/bin/octave-cli', b'--interactive', b'--quiet', b'--no-init-file'] buffer (last 100 chars): '' before (last 100 chars): '' after: match: None match_index: None exitstatus: None flag_eof: False pid: 162 child_fd: 58 closed: False timeout: 30 delimiter: logfile: None logfile_read: None logfile_send: None maxread: 2000 ignorecase: False searchwindowsize: None delaybeforesend: None delayafterclose: 0.1 delayafterterminate: 0.1 searcher: searcher_re: 0: re.compile('PEXPECT_PROMPT>') 1: re.compile('PEXPECT_PROMPT_') 2: re.compile('\\A.+?__stdin_prompt>|debug> ') During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/opt/conda/lib/python3.9/site-packages/ipykernel/kernelbase.py", line 353, in dispatch_shell await result File "/opt/conda/lib/python3.9/site-packages/ipykernel/kernelbase.py", line 643, in execute_request reply_content = self.do_execute( File "/opt/conda/lib/python3.9/site-packages/metakernel/_metakernel.py", line 389, in do_execute retval = self.do_execute_direct(code) File "/opt/conda/lib/python3.9/site-packages/octave_kernel/kernel.py", line 109, in do_execute_direct val = ProcessMetaKernel.do_execute_direct(self, code, silent=silent) File "/opt/conda/lib/python3.9/site-packages/metakernel/process_metakernel.py", line 85, in do_execute_direct output = wrapper.interrupt() File "/opt/conda/lib/python3.9/site-packages/octave_kernel/kernel.py", line 409, in _interrupt return REPLWrapper.interrupt(self.repl, continuation=continuation) File "/opt/conda/lib/python3.9/site-packages/metakernel/replwrap.py", line 266, in interrupt self._expect_prompt(timeout=-1) File 
"/opt/conda/lib/python3.9/site-packages/metakernel/replwrap.py", line 125, in _expect_prompt return self._expect_prompt_stream(expects, timeout) File "/opt/conda/lib/python3.9/site-packages/metakernel/replwrap.py", line 173, in _expect_prompt_stream self.child.expect([u('\r')], timeout=0) File "/opt/conda/lib/python3.9/site-packages/pexpect/spawnbase.py", line 343, in expect return self.expect_list(compiled_pattern_list, File "/opt/conda/lib/python3.9/site-packages/pexpect/spawnbase.py", line 372, in expect_list return exp.expect_loop(timeout) File "/opt/conda/lib/python3.9/site-packages/pexpect/expect.py", line 169, in expect_loop incoming = spawn.read_nonblocking(spawn.maxread, timeout) File "/opt/conda/lib/python3.9/site-packages/pexpect/pty_spawn.py", line 458, in read_nonblocking if select(0): File "/opt/conda/lib/python3.9/site-packages/pexpect/pty_spawn.py", line 450, in select return select_ignore_interrupts([self.child_fd], [], [], timeout)[0] File "/opt/conda/lib/python3.9/site-packages/pexpect/utils.py", line 143, in select_ignore_interrupts return select.select(iwtd, owtd, ewtd, timeout) OSError: [Errno 9] Bad file descriptor WARNING: Execution Failed with traceback saved in /home/jovyan/work/t1_book/content/_build/html/reports/IR_BenefitsAndPitfalls.log Exception occurred: File "/opt/conda/lib/python3.9/site-packages/sos_notebook/converter.py", line 152, in from_notebook_node resources['output_extension'] = '.sos' TypeError: 'NoneType' object does not support item assignment The full traceback has been saved in /tmp/sphinx-err-l7wvujzm.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! 
Traceback (most recent call last): File "/opt/conda/lib/python3.9/site-packages/jupyter_book/sphinx.py", line 167, in build_sphinx app.build(force_all, filenames) File "/opt/conda/lib/python3.9/site-packages/sphinx/application.py", line 343, in build self.builder.build_update() File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 293, in build_update self.build(to_build, File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 307, in build updated_docnames = set(self.read()) File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 414, in read self._read_serial(docnames) File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 435, in _read_serial self.read_doc(docname) File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 475, in read_doc doctree = read_doc(self.app, self.env, self.env.doc2path(docname)) File "/opt/conda/lib/python3.9/site-packages/sphinx/io.py", line 189, in read_doc pub.publish() File "/opt/conda/lib/python3.9/site-packages/docutils/core.py", line 217, in publish self.document = self.reader.read(self.source, self.parser, File "/opt/conda/lib/python3.9/site-packages/sphinx/io.py", line 109, in read self.parse() File "/opt/conda/lib/python3.9/site-packages/docutils/readers/__init__.py", line 78, in parse self.parser.parse(self.input, document) File "/opt/conda/lib/python3.9/site-packages/myst_nb/parser.py", line 84, in parse path_doc = nb_output_to_disc(ntbk, document) File "/opt/conda/lib/python3.9/site-packages/myst_nb/parser.py", line 303, in nb_output_to_disc write_notebook_output(ntbk, str(output_dir), doc_filename) File "/opt/conda/lib/python3.9/site-packages/jupyter_sphinx/execute.py", line 290, in write_notebook_output contents, resources = exporter.from_notebook_node(notebook) File "/opt/conda/lib/python3.9/site-packages/nbconvert/exporters/script.py", line 56, in from_notebook_node return 
exporter.from_notebook_node(nb, resources, **kw) File "/opt/conda/lib/python3.9/site-packages/sos_notebook/converter.py", line 152, in from_notebook_node resources['output_extension'] = '.sos' TypeError: 'NoneType' object does not support item assignment The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/opt/conda/bin/jupyter-book", line 8, in sys.exit(main()) File "/opt/conda/lib/python3.9/site-packages/click/core.py", line 1128, in __call__ return self.main(*args, **kwargs) File "/opt/conda/lib/python3.9/site-packages/click/core.py", line 1053, in main rv = self.invoke(ctx) File "/opt/conda/lib/python3.9/site-packages/click/core.py", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/opt/conda/lib/python3.9/site-packages/click/core.py", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File "/opt/conda/lib/python3.9/site-packages/click/core.py", line 754, in invoke return __callback(*args, **kwargs) File "/opt/conda/lib/python3.9/site-packages/jupyter_book/cli/main.py", line 319, in build builder_specific_actions( File "/opt/conda/lib/python3.9/site-packages/jupyter_book/cli/main.py", line 519, in builder_specific_actions raise RuntimeError(_message_box(msg, color="red", doprint=False)) from result RuntimeError: =============================================================================== There was an error in building your book. Look above for the cause. =============================================================================== (base) jovyan@jupyter-qmrlab-2dt1-5fbook-2di83ukryw:~/work/t1_book$ ```

/tmp/sphinx-err-l7wvujzm.log

``` (base) jovyan@jupyter-qmrlab-2dt1-5fbook-2di83ukryw:~/work/t1_book$ cat /tmp/sphinx-err-l7wvujzm.log # Sphinx version: 4.2.0 # Python version: 3.9.7 (CPython) # Docutils version: 0.17.1 release # Jinja2 version: 2.11.3 # Last messages: # [etoc] Changing master_doc to 'intro' # [etoc] Excluded 2 extra file(s) not in toc # myst v0.15.2: MdParserConfig(renderer='sphinx', commonmark_only=False, enable_extensions=['html_image', 'amsmath', 'colon_fence', 'deflist', 'dollarmath', 'html_admonition', 'linkify', 'replacements', 'smartquotes', 'substitution'], dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area', disable_syntax=[], url_schemes=['mailto', 'http', 'https'], heading_anchors=None, heading_slug_func=None, html_meta=[], footnote_transition=True, substitutions=[], sub_delimiters=['{', '}'], words_per_minute=200) # building [mo]: targets for 0 po files that are out of date # building [html]: targets for 12 source files that are out of date # updating environment: # [new config] # 12 added, 0 changed, 0 removed # reading sources... 
[ 8%] 01/IR_BenefitsAndPitfalls
# Executing: 01/IR_BenefitsAndPitfalls in: /home/jovyan/work/t1_book/content/01
# Loaded extensions:
#   sphinx.ext.mathjax (4.2.0) from /opt/conda/lib/python3.9/site-packages/sphinx/ext/mathjax.py
#   sphinxcontrib.applehelp (1.0.2) from /opt/conda/lib/python3.9/site-packages/sphinxcontrib/applehelp/__init__.py
#   sphinxcontrib.devhelp (1.0.2) from /opt/conda/lib/python3.9/site-packages/sphinxcontrib/devhelp/__init__.py
#   sphinxcontrib.htmlhelp (2.0.0) from /opt/conda/lib/python3.9/site-packages/sphinxcontrib/htmlhelp/__init__.py
#   sphinxcontrib.serializinghtml (1.1.5) from /opt/conda/lib/python3.9/site-packages/sphinxcontrib/serializinghtml/__init__.py
#   sphinxcontrib.qthelp (1.0.3) from /opt/conda/lib/python3.9/site-packages/sphinxcontrib/qthelp/__init__.py
#   alabaster (0.7.12) from /opt/conda/lib/python3.9/site-packages/alabaster/__init__.py
#   sphinx_togglebutton (0.2.3) from /opt/conda/lib/python3.9/site-packages/sphinx_togglebutton/__init__.py
#   sphinx_copybutton (0.4.0) from /opt/conda/lib/python3.9/site-packages/sphinx_copybutton/__init__.py
#   myst_nb (0.13.1) from /opt/conda/lib/python3.9/site-packages/myst_nb/__init__.py
#   jupyter_book (0.12.0) from /opt/conda/lib/python3.9/site-packages/jupyter_book/__init__.py
#   sphinx_thebe (0.0.10) from /opt/conda/lib/python3.9/site-packages/sphinx_thebe/__init__.py
#   sphinx_comments (0.0.3) from /opt/conda/lib/python3.9/site-packages/sphinx_comments/__init__.py
#   sphinx_external_toc (0.2.3) from /opt/conda/lib/python3.9/site-packages/sphinx_external_toc/__init__.py
#   sphinx.ext.intersphinx (4.2.0) from /opt/conda/lib/python3.9/site-packages/sphinx/ext/intersphinx.py
#   sphinx_panels (0.6.0) from /opt/conda/lib/python3.9/site-packages/sphinx_panels/__init__.py
#   sphinx_book_theme (unknown version) from /opt/conda/lib/python3.9/site-packages/sphinx_book_theme/__init__.py
#   sphinx_jupyterbook_latex (unknown version) from /opt/conda/lib/python3.9/site-packages/sphinx_jupyterbook_latex/__init__.py
#   pydata_sphinx_theme (unknown version) from /opt/conda/lib/python3.9/site-packages/pydata_sphinx_theme/__init__.py
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/jupyter_book/sphinx.py", line 167, in build_sphinx
    app.build(force_all, filenames)
  File "/opt/conda/lib/python3.9/site-packages/sphinx/application.py", line 343, in build
    self.builder.build_update()
  File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 293, in build_update
    self.build(to_build,
  File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 307, in build
    updated_docnames = set(self.read())
  File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 414, in read
    self._read_serial(docnames)
  File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 435, in _read_serial
    self.read_doc(docname)
  File "/opt/conda/lib/python3.9/site-packages/sphinx/builders/__init__.py", line 475, in read_doc
    doctree = read_doc(self.app, self.env, self.env.doc2path(docname))
  File "/opt/conda/lib/python3.9/site-packages/sphinx/io.py", line 189, in read_doc
    pub.publish()
  File "/opt/conda/lib/python3.9/site-packages/docutils/core.py", line 217, in publish
    self.document = self.reader.read(self.source, self.parser,
  File "/opt/conda/lib/python3.9/site-packages/sphinx/io.py", line 109, in read
    self.parse()
  File "/opt/conda/lib/python3.9/site-packages/docutils/readers/__init__.py", line 78, in parse
    self.parser.parse(self.input, document)
  File "/opt/conda/lib/python3.9/site-packages/myst_nb/parser.py", line 84, in parse
    path_doc = nb_output_to_disc(ntbk, document)
  File "/opt/conda/lib/python3.9/site-packages/myst_nb/parser.py", line 303, in nb_output_to_disc
    write_notebook_output(ntbk, str(output_dir), doc_filename)
  File "/opt/conda/lib/python3.9/site-packages/jupyter_sphinx/execute.py", line 290, in write_notebook_output
    contents, resources = exporter.from_notebook_node(notebook)
  File "/opt/conda/lib/python3.9/site-packages/nbconvert/exporters/script.py", line 56, in from_notebook_node
    return exporter.from_notebook_node(nb, resources, **kw)
  File "/opt/conda/lib/python3.9/site-packages/sos_notebook/converter.py", line 152, in from_notebook_node
    resources['output_extension'] = '.sos'
TypeError: 'NoneType' object does not support item assignment
(base) jovyan@jupyter-qmrlab-2dt1-5fbook-2di83ukryw:~/work/t1_book$ ```

I find it difficult to interpret these logs. The main part seems to be:

  File "/opt/conda/lib/python3.9/site-packages/sos_notebook/converter.py", line 152, in from_notebook_node
    resources['output_extension'] = '.sos'
TypeError: 'NoneType' object does not support item assignment

However, SoS works fine when executing the notebooks inside JupyterLab, so my gut tells me that this is an incompatibility between Jupyter Book's notebook-execution setup and SoS. I have to manually add the SoS kernel to my environment during installation (https://github.com/qMRLab/t1_book/blob/058ae1a8be743c4dab14b6f32156084f36071b3d/binder/Dockerfile#L67); I don't know if this added kernel is seen by Jupyter Book's environment.
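One way to check whether the manually added kernel is visible to the environment Jupyter Book runs in is to query the kernelspec registry. A minimal sketch; the `sos` kernel name is an assumption based on the Dockerfile step above:

```python
# List the kernels registered in the current Jupyter environment, using
# jupyter_client's kernelspec API (the same registry that
# `jupyter kernelspec list` prints). If 'sos' were missing from this list,
# Jupyter Book could not execute SoS notebooks.
from jupyter_client.kernelspec import KernelSpecManager

specs = KernelSpecManager().get_all_specs()
print(sorted(specs))
```

If the kernel shows up here but the build still fails, the problem is downstream of kernel discovery, which would point back at the exporter error in the traceback.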

I don't know if this is something I can resolve on my end without hacking Jupyter Book's backend.
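For what it's worth, the failing assignment can be reproduced in isolation, which suggests a small upstream guard would fix it. A minimal sketch with hypothetical stand-in functions, not the actual sos_notebook source:

```python
# jupyter_sphinx calls exporter.from_notebook_node(notebook) without a
# `resources` argument, so `resources` arrives as None inside the SoS
# converter, which then tries to assign into it.
def sos_from_notebook_node(nb, resources=None):
    # mirrors sos_notebook/converter.py line 152 in the traceback
    resources['output_extension'] = '.sos'
    return nb, resources

# A defensive version that would avoid the TypeError:
def sos_from_notebook_node_fixed(nb, resources=None):
    if resources is None:
        resources = {}
    resources['output_extension'] = '.sos'
    return nb, resources

try:
    sos_from_notebook_node({}, None)
except TypeError as err:
    print(err)  # 'NoneType' object does not support item assignment

nb, res = sos_from_notebook_node_fixed({})
print(res)  # {'output_extension': '.sos'}
```

So a two-line `if resources is None` guard in the SoS converter (or jupyter_sphinx passing an empty dict) would likely be enough; whether that is the right place to patch is for the upstream maintainers to decide.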

Open to suggestions.

mathieuboudreau commented 3 years ago

I've opened an issue on Jupyter Book for this problem here https://github.com/executablebooks/jupyter-book/issues/1519.

ltetrel commented 3 years ago

I've opened an issue on Jupyter Book for this problem here executablebooks/jupyter-book#1519.

Hope that this will help :)

In the meantime, we can still fall back to pre-executed notebook content stored in the repo if we want to list the book on NeuroLibre? We can also try to get cached execution content using repo2data; I have a workflow in mind, let me know.

mathieuboudreau commented 3 years ago

In the meantime, we can still fall back to pre-executed notebook content stored in the repo if we want to list the book on NeuroLibre?

I've been asking for this since last month, but @pbellec was strongly against it, I think? Or was it you? Regardless, I've invested time making my repo compatible with the Jupyter Book auto-execution methods I was asked to use, so I'd like those efforts and work hours not to have gone to waste, and to move in that direction (auto-execution) if possible.

ltetrel commented 3 years ago

As I highlighted, this could be a temporary (and faster) solution to have your submission listed on the NeuroLibre website. At this stage, if we have no solution other than using pre-executed content, it is better than nothing, I think (also regarding what Nikola raised at our last meeting). But definitely not my choice, just throwing out an idea...

mathieuboudreau commented 3 years ago

As I highlighted, this could be a temporary (and faster) solution to have your submission listed on the NeuroLibre website. At this stage, if we have no solution other than using pre-executed content, it is better than nothing, I think (also regarding what Nikola raised at our last meeting). But definitely not my choice, just throwing out an idea...

I agree - that's why I suggested this last month (knowing how difficult the transition to auto-executed notebooks would be for this setup) but was denied. I even mentioned it at yesterday's meeting, if I recall, but no one signalled that they had changed their minds about allowing this workflow, so last night and today I worked on converting my repo to the required Jupyter Book auto-execute workflow I was asked to use.

@pbellec please chime in, I don't want to spend more time doing unnecessary/unusable work in this issue.

ltetrel commented 3 years ago

Just my opinion: what you are doing is needed (you are not wasting your time...), and publishing pre-executed content is independent from trying to re-build the book. It does not change the fact that it needs to be reproducible/re-executable at some point. In recap, I am just proposing an alternative, independent path for the renewal.

mathieuboudreau commented 3 years ago

In recap, I am just proposing an alternative, independent path for the renewal.

This submission is no longer needed for the renewal, because it is working here on the old NeuroLibre website: https://www.neurolibre.com.

In recap, I am just proposing an alternative, independent path for the renewal.

As a user, I just wish that you had been open to this idea last month when I first submitted the book this way, and had warned me of the delay that doing it otherwise would cause.

ltetrel commented 3 years ago

This submission is no longer needed for the renewal, because it is working here on the old NeuroLibre website: https://www.neurolibre.com.

Oh ok, my understanding was that it was super urgent because of the renewal!

As a user, I just wish that you had been open to this idea last month when I first submitted the book this way, and had warned me of the delay that doing it otherwise would cause.

Indeed, I could not have predicted that we would face so many issues when trying to re-execute it; I am not familiar with the jupyter-book ecosystem... And I did not know that this submission had not been re-executed for a few months, if not years (I actually highlighted the fact that the notebook execution was off) :(

mathieuboudreau commented 3 years ago

The problem was not how long ago the notebooks had been executed; it was 1) a bug introduced when changing the directory structure to comply with NeuroLibre (adding the necessary paths in the notebooks), and 2) what was needed to get the book building with Jupyter Book's auto-execute function.

mathieuboudreau commented 3 years ago

Opened an issue on SoS's repo since the Jupyter Book folks are not responding on their end.

https://github.com/vatlab/sos/issues/1447

mathieuboudreau commented 3 years ago

I also fixed the three issues mentioned above.

mathieuboudreau commented 3 years ago

@pbellec here is a screen-captured walkthrough, from starting MyBinder to executing every cell in every notebook in under 20 minutes: https://www.youtube.com/watch?v=_HVnz34UaF0

All images compile and no errors occur.

mathieuboudreau commented 3 years ago

@ltetrel Since upgrading nbconvert, Jupyter Book, and SoS to their latest versions, even building the book with pre-executed notebooks now fails. See this screen-captured walkthrough: https://www.youtube.com/watch?v=sLPU1FzBue8

So this shortcut is no longer an option for this submission - there is a bug in either the latest version of Jupyter Book or SoS, or the two tools are simply no longer compatible with each other, as they were in the older versions we used to build this book, which is still stable and fully functional.

ltetrel commented 3 years ago

Did you try an older version of Jupyter Book? I know that https://github.com/ltetrel/nha2020-nilearn could not build with the latest version of jb, so I fell back to 0.10.
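If trying a downgrade, a pinned requirements file is the easiest experiment. A sketch; the 0.10 pin is the version reported to build above, and the assumption is that the repo pins its build dependencies in a requirements file (e.g. under binder/):

```
# requirements.txt sketch: pin Jupyter Book to the 0.10.x line that reportedly
# still builds; other pins deliberately left untouched. PEP 440 wildcard pin.
jupyter-book==0.10.*
```

Whether 0.10 is also compatible with the current SoS and nbconvert versions would still need to be verified.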

pbellec commented 3 years ago

@mathieuboudreau first of all, huge thanks for your patience - I've been under deadlines for a couple of days and am only catching up now. The primary value proposition of NeuroLibre is re-executable content, so I re-iterate that we need to take this submission to a point where we can re-execute the code.

If I understood correctly, at this stage SoS has become incompatible with Jupyter Book, and you opened issues with both projects. There is nothing more we can do. I've noted that the last SoS release was two years ago, so it makes sense that we are running into issues with the recent major iteration of Jupyter Book. I love the idea of showcasing multiple languages inside one book - but it looks like we are running out of options. Or at least it is out of our control.

If we want to move forward with this rapidly, you could turn the simulation into a stand-alone repo with a dependency on Octave only. You can make that repo easily reproducible through a command line. Save the outputs of the simulation on Zenodo. And then start the Jupyter Book from the Zenodo data using repo2data (adding a link to the Octave repo), with Python-only dependencies. For heavy processing, this would be the best (and only) approach anyway. In that sense it would be a very useful example for future users.
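For reference, the repo2data half of that workflow is driven by a small requirement file. A sketch with placeholder values; the Zenodo record, archive name, and project name are hypothetical, and the field names are from memory of repo2data examples, so check the repo2data README before relying on them:

```json
{
  "src": "https://zenodo.org/record/XXXXXXX/files/t1_simulation_outputs.zip",
  "dst": "./data",
  "projectName": "t1-book-simulations"
}
```

repo2data would then fetch the pregenerated Octave outputs at build time, leaving the book itself with Python-only dependencies.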

mathieuboudreau commented 3 years ago

If I understood correctly, at this stage SoS has become incompatible with Jupyter Book

It's unclear to me at the moment if it's SoS that has become incompatible with Jupyter Book, or Jupyter Book that has become incompatible with SoS.

If we want to move forward with this rapidly, you could turn the simulation into a stand-alone repo with a dependency on Octave only. You can make that repo easily reproducible through a command line. Save the outputs of the simulation on Zenodo. And then start the Jupyter Book from the Zenodo data using repo2data (adding a link to the Octave repo), with Python-only dependencies. For heavy processing, this would be the best (and only) approach anyway. In that sense it would be a very useful example for future users.

If we go this route, I'd rather just save all the simulation outputs that are generated in Octave, delete all qMRLab code and dependencies, and simply load and plot data in Python. This should reduce the Docker image from about 4 GB to under 1 GB, which is something Loic wants to see in submissions. The only disadvantage is that all the notebooks would do is load and plot data (i.e., users won't be able to reproduce or modify the results/plots), but at least we could say that the submission works in NeuroLibre just for the sake of saying it does.
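The load-and-plot notebook described above would be short. A sketch with synthetic stand-in data; in the real workflow the .npz archive would hold the pre-computed Octave/qMRLab simulation outputs, and the file and key names here are hypothetical:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, as in a non-interactive book build
import matplotlib.pyplot as plt

# Stand-in for the archive of Octave/qMRLab simulation outputs; in the real
# workflow this .npz would be pre-computed and downloaded, not generated here.
ti = np.linspace(50, 3000, 50)             # inversion times (ms)
t1 = 900.0                                 # hypothetical T1 value (ms)
signal = np.abs(1 - 2 * np.exp(-ti / t1))  # magnitude inversion-recovery curve
np.savez("ir_simulation_outputs.npz", inversion_times=ti, signal=signal)

# The notebook side then only loads and plots:
data = np.load("ir_simulation_outputs.npz")
fig, ax = plt.subplots()
ax.plot(data["inversion_times"], data["signal"])
ax.set_xlabel("Inversion time (ms)")
ax.set_ylabel("|Signal| (a.u.)")
fig.savefig("ir_signal.png")
```

This is exactly the trade-off described: the build becomes light and robust, but the notebook no longer regenerates the curve it plots.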

pbellec commented 3 years ago

at least we could say that the submission works in NeuroLibre just for the sake of saying it does.

I feel your frustration. I have learned the hard way about the tension between fancy features (here, multi-language support) and robustness/maintainability, and have grown to value the latter much more. And to re-iterate: Jupyter Books are simply not a good fit for sharing long computations. It makes more sense to have a combination of modular, reproducible repos, with a Jupyter Book as a last layer for visualizations and relatively light analyses. If the Octave repo is well done, it will be an integral part of the work - simply expose it as a submodule, and the work will be fully reproducible. Just not done with a one-size-fits-all technology.

mathieuboudreau commented 3 years ago

My frustration mostly comes from the fact that this book, with all its features, has already been created and sustained with strong stability for 3 years, here: http://qmrlab.org/t1_book/intro. After spending 2-3 months trying to make it compatible with the automated pipeline necessary to publish it in NeuroLibre, I'm now being asked to significantly "dumb down" my book, which creates an even greater workload for me, the user, with no additional benefit (since I only ever want to share my full-featured book anyway). If NeuroLibre had not been so ambitious with its automation goals, I feel that a set of reviewers could have simply reviewed my already prepared full-featured Jupyter Book and migrated it onto your servers in a matter of days/weeks instead of months. I think that if external users are asked to delete the heart of their submitted notebooks (simulations/fitting/etc.) and just plot data, they might opt out of submitting to NeuroLibre and simply self-publish through GitHub or their own websites, as we have done.

pbellec commented 3 years ago

The root cause here is the lack of support for SoS in the recent Jupyter Book. It sucks that we ran into an incompatibility issue, sure. I don't think this calls into question the entire NeuroLibre workflow.

Another option would be to have one notebook with the Octave code (and an Octave kernel) and one notebook with the Python code (with a Python kernel, starting from a pregenerated data archive).

This way users can re-execute all parts of the analysis, just not chain them like SoS can.
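That split maps naturally onto a Jupyter Book table of contents. A hypothetical _toc.yml sketch in the jb-book schema used by recent Jupyter Book releases; the file names are invented for illustration:

```yaml
# _toc.yml sketch for the two-notebook split: the Octave notebook regenerates
# the simulation archive, and the Python notebook loads it and plots.
format: jb-book
root: intro
chapters:
  - file: 01_ir_simulations_octave   # notebook saved with the octave kernel
  - file: 02_ir_analysis_python      # notebook saved with the python3 kernel
```

Each notebook would carry its own kernelspec in its metadata, so both stay individually re-executable without SoS chaining them.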

mathieuboudreau commented 3 years ago

@roboneuro generate nl-notebook from branch neurolibre-plots-only

roboneuro commented 3 years ago
Attempting NeuroLibre notebook compilation from custom branch neurolibre-plots-only. Reticulating splines etc...
roboneuro commented 3 years ago

:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:

mathieuboudreau commented 3 years ago

@roboneuro generate pdf from branch neurolibre-plots-only

roboneuro commented 3 years ago
Attempting PDF compilation from custom branch neurolibre-plots-only. Reticulating splines etc...