roboneuro closed this issue 1 year ago
Thanks, @agahkarakuzu ! I feel better that it's consistent across both build systems. I'll try to track it down ASAP.
I changed the intro.md; you can click the link to see the changes. Now I've pushed another commit, 7b4e6e1, for which I started a book build on the RoboNeuro page, as @emdupre suggested.
If you don't update the Dockerfile, I think it will use a cached version of the Docker image, and since the GitHub repo is cloned inside the image, I think that's why you're not seeing your changes reflected after you push.
We've been dealing with this "feature" of Binder/Dockerfiles for years.... very annoying.
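A common generic workaround for this caching behaviour is to thread a build argument through the clone step, so that a new commit hash invalidates the cache from that layer onward. This is a sketch only; the `COMMIT` ARG name and the idea of passing it through this pipeline are assumptions, not the project's actual setup:

```dockerfile
# Hypothetical sketch: any layer after this ARG is re-run whenever its
# value changes, so passing the current commit hash busts the cache at
# the clone step instead of reusing a stale layer.
ARG COMMIT=unknown
RUN git clone https://github.com/qMRLab/t1_book.git && \
    cd t1_book && \
    git checkout "$COMMIT"
```

It would be built with something like `docker build --build-arg COMMIT=$(git rev-parse HEAD) .`; whether repo2docker forwards a user-defined build arg in this pipeline is an open question.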
I can see that for the Binder build, it used the cache for the stage where you clone the repo and check out the neurolibre branch.
We are sending the book build request to our server with a commit hash. My impression was that we would pull and check out that specific commit in the environment before the book build. I think this is not happening. Even if the neurolibre branch your Dockerfile checks out is behind the current HEAD, this would solve that issue. See the book build:
It cannot detect any changes.
I think we use new commit hashes as a flag to trigger a new build, but we are not necessarily checking out that commit on the server side. Is this the case, @ltetrel? If so, can we ensure that the book build mechanism does a git pull and a git checkout of the commit that comes with the API request?
In the non-Docker-image cases, the commit we request most likely coincides with HEAD. Still, we should not take chances :)
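For what it's worth, the pull-and-checkout mechanism being asked for can be sketched end to end with a throwaway local repository standing in for t1_book (an illustration only, not the actual NeuroLibre server code):

```shell
#!/bin/sh
# Hypothetical sketch: before building, force the working tree to the
# exact commit hash that arrived with the API request.
set -e

# Demo stand-in for the user repo: two commits; we "request" the first.
origin=$(mktemp -d)
git -C "$origin" init -q
git -C "$origin" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"
requested=$(git -C "$origin" rev-parse HEAD)   # hash sent with the request
git -C "$origin" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"        # branch HEAD moves past it

# Proposed server-side step: fetch, then check out the requested hash.
clone=$(mktemp -d)
git clone -q "$origin" "$clone"
git -C "$clone" fetch -q origin
git -C "$clone" checkout -q "$requested"

# The book build would run here; verify we are on the requested commit.
test "$(git -C "$clone" rev-parse HEAD)" = "$requested" && echo "on requested commit"
```

The point is that the final `rev-parse HEAD` matches the requested hash even though the branch tip has moved on, which is exactly the stale-branch situation described above.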
First, this repo is the first case where we build from a different branch (if I remember correctly). I also had the impression that it would not build the correct commit, but I checked a few weeks ago and it correctly selects the right commit. Indeed, Binder has no notion of branches; it sees just commit hashes.
In our case, I suppose you are building this branch (neurolibre): https://github.com/qMRLab/t1_book/commit/7b4e6e1cc714540b27d646ba94930e3d564f6565 On the server, we can see that the commit hash is indeed correct:
I think what is happening here is due to the bad design of the Dockerfile, where it pulls the main branch (so obviously Docker has some caching for this part): https://github.com/qMRLab/t1_book/blob/7b4e6e1cc714540b27d646ba94930e3d564f6565/binder/Dockerfile#L67-L68 This is something I have already mentioned a few times... Also, when you connect to the Binder instance, you see that the root repo corresponds to the neurolibre branch:
Also, please refactor your folder here: https://github.com/qMRLab/t1_book/tree/neurolibre/content
It should be clean of any _build, and why are the requirements and Dockerfile inside it?? Now your repo is over 668 MB! This is not normal.
jovyan@9020d0fd657b:~/work$ du -hs t1_book/
668M t1_book/
To clean the git history (and reduce the size of the Docker image caused by the HTML files), I would create a new repo to start from a clean foundation.
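If starting a brand-new repository feels too drastic, one way to get the same "clean foundation" while keeping the working tree is an orphan branch. A sketch with a throwaway repo (illustration only; the file and messages are made up):

```shell
#!/bin/sh
# Hypothetical sketch: keep the current files but drop the entire history
# (including any large _build/ blobs) by recreating a single root commit
# on an orphan branch.
set -e
repo=$(mktemp -d)
git -C "$repo" init -q
echo "# T1 book" > "$repo/intro.md"
git -C "$repo" add -A
git -C "$repo" -c user.name=demo -c user.email=demo@example.com commit -qm "one"
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "two"

git -C "$repo" checkout -q --orphan clean   # new branch with no parents
git -C "$repo" add -A                       # working tree carries over
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -qm "clean foundation"

git -C "$repo" rev-list --count HEAD        # history is gone, files remain
```

Tools like `git filter-repo` or the BFG can instead strip only the offending blobs while keeping the rest of the history, if losing all history is unacceptable.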
Ok for Dockerfile.
By removing _build/, Jupyter Book builds will take 2-4 hours on RoboNeuro (the simulations take a long time); can you confirm that this is what you want?
It's unreasonable to ask users (not just me) to delete the complete git histories of their projects (as bloated as they may be) for NeuroLibre submissions. I think a better solution would be to clone just a single commit.
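The single-commit idea is doable with a depth-1 fetch of the requested hash. A sketch with a throwaway local "origin" (illustration only; the `uploadpack.allowAnySHA1InWant` setting is needed for plain git servers, and GitHub generally permits fetching a full commit hash):

```shell
#!/bin/sh
# Hypothetical sketch: fetch only the requested commit at depth 1, so the
# bloated history never reaches the build environment.
set -e
origin=$(mktemp -d)
git -C "$origin" init -q
git -C "$origin" config uploadpack.allowAnySHA1InWant true  # permit fetch-by-hash
git -C "$origin" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "old, heavy history"
git -C "$origin" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "requested state"
sha=$(git -C "$origin" rev-parse HEAD)

work=$(mktemp -d)
git -C "$work" init -q
git -C "$work" remote add origin "file://$origin"
git -C "$work" fetch -q --depth 1 origin "$sha"  # only this one commit
git -C "$work" checkout -q FETCH_HEAD

git -C "$work" rev-list --count HEAD   # the rest of the history stayed behind
```

The shallow working copy ends up on exactly the requested hash with a single-commit history, so repository bloat stops affecting build-time transfer.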
I understand that it is not "reasonable", but it is this kind of optimization that will make the build faster (so we can iterate faster). Deleting just the _build folder will not resolve the size (slow build) issue, since it will stay in the git history.
So this is what I am proposing; if we don't want to do that, no problem, but then please, no more complaints about build time :)
I don't recall complaining about the build time; it's similar to the MyBinder performance for my images. The lack of logs displayed to the user during RoboNeuro builds, however....
I don't recall complaining about the build time; it's similar to the MyBinder performance for my images. The lack of logs displayed to the user during RoboNeuro builds, however....
You still have the issue of no logs being output by RoboNeuro? @agahkarakuzu this is really problematic; we already mentioned it a few weeks ago. As for streaming back the live logs (instead of having a single file at the end): if I recall correctly from a few months ago, this was also a RoboNeuro limitation, not doable (but already possible on the API side).
On the server, we can see that the commit hash is indeed correct:
On the server, on the instance that has been spawned for the book build (the one with all the Jupyter Book and user dependencies), do we have a mechanism that looks like this (Case 1):
git pull
git checkout "the indeed correct commit hash"
build-book ...
Or (Case 2) are the outputs merely named with the "indeed correct commit hash", while the book build is performed like:
build-book ...
Even in the bad Docker case, if we have a mechanism to check out a branch, we should be able to hit the target.
If you are sure that we have Case 1 in place, then the only possible roadblock would be having another repo (qMRLab) cloned into the t1_book repo in the Dockerfile to check out the "indeed correct commit hash". Below are the full logs:
Cloning into '/tmp/repo2dockerl4yxeev4'...
Updating files: 100% (191/191), done.
HEAD is now at 7b4e6e1 Update intro.md
Using DockerBuildPack builder
Step 1/6 : FROM jupyter/base-notebook:8ccdfc1da8d5
---> 0f1ec5090952
Step 2/6 : USER root
---> Using cache
---> 3a6f51f69c36
Step 3/6 : RUN apt-get update && apt-get install -y --no-install-recommends build-essential=12.4ubuntu1 emacs git inkscape jed libsm6 libxext-dev libxrender1 lmodern netcat unzip nano curl wget gfortran cmake bsdtar rsync imagemagick gnuplot-x11 libopenblas-base octave liboctave-dev octave-info octave-parallel octave-struct octave-io octave-statistics octave-optim octave-image python3-dev ttf-dejavu && apt-get clean && apt-get autoremove && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
---> Using cache
---> e1acc844f56a
Step 4/6 : RUN cd $HOME/work; pip install --upgrade pip --ignore-installed certifi; pip install --use-deprecated=legacy-resolver octave_kernel sos==0.17.7 sos-notebook==0.17.2 sos-python==0.9.12.1 sos-bash==0.12.3 sos-matlab==0.9.12.1 sos-ruby==0.9.15.0 sos-sas==0.9.12.3 sos-julia==0.9.12.1 sos-javascript==0.9.12.2 sos-r==0.9.12.2 scipy plotly==3.10.0 flask ipywidgets nbconvert==5.4.0 jupyterlab==2.2.0 jupyter-book==0.10.0 jupytext repo2data; python -m sos_notebook.install; git clone https://github.com/qMRLab/t1_book.git; cd t1_book/; git checkout neurolibre; git clone https://github.com/neuropoly/qMRLab.git; cd qMRLab; git checkout 0e97155a6e310911e575ebd8f8870e5f2988a82b; cd ..; mkdir qMRLab/data; for i in _requirements/*_dataset.json; do repo2data -r "$i"; done; chmod -R 777 $HOME/work/t1_book/; octave --eval "cd qMRLab; startup; pkg list;"
---> Using cache
---> 41add17911a7
Step 5/6 : WORKDIR $HOME/work/t1_book
---> Using cache
---> e81497bc90b5
Step 6/6 : USER $NB_UID
---> Using cache
---> f2fa173f13f1
{"aux": {"ID": "sha256:f2fa173f13f173a20408f440c50d609b95953742680298c84b595bc4d0624441"}}
[Warning] One or more build-args [NB_UID NB_USER] were not consumed
Successfully built f2fa173f13f1
Successfully tagged binder-registry.conp.cloud/binder-registry.conp.cloud/binder-qmrlab-2dt1-5fbook-67f2dc:7b4e6e1cc714540b27d646ba94930e3d564f6565
Successfully pushed binder-registry.conp.cloud/binder-registry.conp.cloud/binder-qmrlab-2dt1-5fbook-67f2dc:7b4e6e1cc714540b27d646ba94930e3d564f6565
Built image, launching...
server running at https://binder.conp.cloud/jupyter/user/qmrlab-t1_book-dwreu4x7/
"image": "binder-registry.conp.cloud/binder-registry.conp.cloud/binder-qmrlab-2dt1-5fbook-67f2dc:7b4e6e1cc714540b27d646ba94930e3d564f6565"
"repo_url": "https://github.com/qMRLab/t1_book"
"token": "sLWEfLJwRiuM0Oe7MzkEJw"
"binder_ref_url": "https://github.com/qMRLab/t1_book/tree/7b4e6e1cc714540b27d646ba94930e3d564f6565"
"binder_launch_host": "https://binder.conp.cloud/"
"binder_request": "v2/gh/qMRLab/t1_book.git/7b4e6e1cc714540b27d646ba94930e3d564f6565"
"binder_persistent_request": "v2/gh/qMRLab/t1_book/7b4e6e1cc714540b27d646ba94930e3d564f6565"
WARNING: Unknown key in `_toc.yml`: expand_sections
WARNING: Unknown key in `_toc.yml`: expand_sections
WARNING: multiple files found for the document "01/IR_References": ['01/IR_References.ipynb', '01/IR_References.md']
Use '/home/jovyan/work/t1_book/content/01/IR_References.md' for the build.
/home/jovyan/work/t1_book/content/01/IR_BenefitsAndPitfalls.ipynb:10002: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/01/IR_DataFitting.ipynb:10002: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/01/IR_Introduction.md:1: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/01/IR_OtherMethods.md:1: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/01/IR_References.md:1: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/01/IR_SignalModelling.ipynb:10002: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/02/VFA_BenefitsAndPitfalls.md:1: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/02/VFA_DataFitting.ipynb:10002: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/02/VFA_Introduction.md:1: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/02/VFA_Introduction.md:: WARNING: image file not readable: 02/images/vfa_pulsesequences.png
/home/jovyan/work/t1_book/content/02/VFA_References.md:1: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
/home/jovyan/work/t1_book/content/02/VFA_SignalModelling.ipynb:10002: WARNING: Non-consecutive header level increase; 0 to 2 [myst.header]
Running Jupyter-Book v0.10.0
Source Folder: /home/jovyan/work/t1_book/content
Config Path: /home/jovyan/work/t1_book/content/_config.yml
Output Path: /home/jovyan/data/book-artifacts/qMRLab/github.com/t1_book/7b4e6e1cc714540b27d646ba94930e3d564f6565/_build/html
Running Sphinx v3.5.4
making output directory... done
myst v0.13.7: MdParserConfig(renderer='sphinx', commonmark_only=False, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, update_mathjax=True, enable_extensions=['html_image', 'amsmath', 'colon_fence', 'deflist', 'dollarmath', 'html_admonition', 'linkify', 'replacements', 'smartquotes', 'substitution'], disable_syntax=[], url_schemes=['mailto', 'http', 'https'], heading_anchors=None, html_meta=[], footnote_transition=True, substitutions=[], sub_delimiters=['{', '}'])
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 12 source files that are out of date
updating environment: [new config] 12 added, 0 changed, 0 removed
reading sources... [ 8%] 01/IR_BenefitsAndPitfalls
reading sources... [ 16%] 01/IR_DataFitting
reading sources... [ 25%] 01/IR_Introduction
reading sources... [ 33%] 01/IR_OtherMethods
reading sources... [ 41%] 01/IR_References
reading sources... [ 50%] 01/IR_SignalModelling
reading sources... [ 58%] 02/VFA_BenefitsAndPitfalls
reading sources... [ 66%] 02/VFA_DataFitting
reading sources... [ 75%] 02/VFA_Introduction
reading sources... [ 83%] 02/VFA_References
reading sources... [ 91%] 02/VFA_SignalModelling
reading sources... [100%] intro
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [ 8%] 01/IR_BenefitsAndPitfalls
writing output... [ 16%] 01/IR_DataFitting
writing output... [ 25%] 01/IR_Introduction
writing output... [ 33%] 01/IR_OtherMethods
writing output... [ 41%] 01/IR_References
writing output... [ 50%] 01/IR_SignalModelling
writing output... [ 58%] 02/VFA_BenefitsAndPitfalls
writing output... [ 66%] 02/VFA_DataFitting
writing output... [ 75%] 02/VFA_Introduction
writing output... [ 83%] 02/VFA_References
writing output... [ 91%] 02/VFA_SignalModelling
writing output... [100%] intro
generating indices... genindex done
writing additional pages... search done
copying images... [100%] 01/images/ir_pulsesequences.png
copying static files... done
copying extra files... done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded, 16 warnings.
The HTML pages are in ../../data/book-artifacts/qMRLab/github.com/t1_book/7b4e6e1cc714540b27d646ba94930e3d564f6565/_build/html.
===============================================================================
Finished generating HTML for book.
Your book's HTML pages are here:
../../data/book-artifacts/qMRLab/github.com/t1_book/7b4e6e1cc714540b27d646ba94930e3d564f6565/_build/html/
You can look at your book by opening this file in a browser:
../../data/book-artifacts/qMRLab/github.com/t1_book/7b4e6e1cc714540b27d646ba94930e3d564f6565/_build/html/index.html
Or paste this line directly into your browser bar:
file:///home/jovyan/data/book-artifacts/qMRLab/github.com/t1_book/7b4e6e1cc714540b27d646ba94930e3d564f6565/_build/html/index.html
===============================================================================
Re: logs, @ltetrel: we are sending two files back in the email, one for the BinderHub build and one for the book build, if the build is requested from RoboNeuro. Similarly, we can dispatch them to the issue comment in tags (as I did below). So the logs are being fetched; nothing to worry about there.
Then that is good news. I am not sure why @mathieuboudreau was saying that he had no logs, then? Maybe he was referring to the RoboNeuro page, which was always empty.
We have Case 2: https://github.com/neurolibre/terraform-binderhub/blob/c1e47d91ab7405e11076806e48c13759c89786f1/terraform-modules/binderhub/assets/config.yaml#L116 Basically, I run the book build command inside the Docker container. The container from the Docker image is tagged with the commit hash, and when I looked inside it, it was indeed the neurolibre branch that was checked out.
I think we should have Case 1 either way, to be on the safe side! Is this possible?
when I looked inside it, it was indeed the neurolibre branch that was checked out.
It is the neurolibre branch that is checked out, but the HEAD of that branch is behind the "indeed correct commit hash" that we would like to build. Even if this setback comes from an old version of the repo (t1_book) being cloned and checked out (a neurolibre HEAD older than the "current commit hash"), Case 1 would overcome it.
We should not use commit hashes merely as a means of artifact management; we should use that info to be 100% sure that we are building what has been requested. Do you agree, @ltetrel?
@ltetrel I've cleared the outputs of all Jupyter Notebooks and removed the _build/ folder in the branch, as well as the unused Dockerfile and requirements file; it is now ready for you in commit 6b3405.
@roboneuro generate nl-notebook from branch neurolibre
Attempting NeuroLibre notebook compilation from custom branch neurolibre. Reticulating splines etc...
:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:
Even if this setback comes from an old version of the repo (t1_book) being cloned and checked out (a neurolibre HEAD older than the "current commit hash"), Case 1 would overcome it.
I am not sure I follow. It seems to me that Mathieu is changing the commit on purpose. If he is testing his jupyter-book build locally, it will build for this specific commit. This is one of the biggest issues with users and Dockerfiles: they do what they want with the environment (vs. requirements files, where we have more control because the environment is more "standard").
I don't agree that we should check out ourselves and build from that (Case 1); we should build what the user prepared for us (in this case, https://github.com/qMRLab/t1_book/blob/7b4e6e1cc714540b27d646ba94930e3d564f6565/binder/Dockerfile#L67-L68). And anyway this is not even possible (because we need to be in the user environment). Maybe you remember my first version of the build workflow, where I was checking out the repo and running repo2docker myself (instead of now: user Docker inside a BinderHub pod, inside k8s, inside a VM, hahah)?
:point_right::page_facing_up: View built NeuroLibre Notebook :page_facing_up::point_left:
I forgot to change the config file to rebuild the notebooks; retrying.
@roboneuro generate nl-notebook from branch neurolibre
Attempting NeuroLibre notebook compilation from custom branch neurolibre. Reticulating splines etc...
:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:
@roboneuro generate nl-notebook from branch neurolibre
Basically, this just replaces the user triggering the build via the web page, right?
:point_right::page_facing_up: View built NeuroLibre Notebook :page_facing_up::point_left:
Also @mathieuboudreau, if you have already asked for a build, you need to wait for that first build to finish before requesting another. RoboNeuro should output a message that there is already a build in progress (as the API does; it returns error 409). Sadly, it is not possible to interrupt a Binder build!
This is one of the biggest issues with users and Dockerfiles: they do what they want with the environment (vs. requirements files, where we have more control because the environment is more "standard").
As I've mentioned before, all members of our lab have previously tried to set up requirements.txt and postBuild alternatives to the Dockerfile for this book, and did not succeed after weeks of trying, due to the special SoS setup. I'd be very happy if you would try yourself, @ltetrel, and propose a non-Dockerfile solution in a pull request, as you seem to be more knowledgeable with these tools (I am more comfortable with Dockerfiles).
Also @mathieuboudreau, if you have already asked for a build, you need to wait for that first build to finish before requesting another. RoboNeuro should output a message that there is already a build in progress (as the API does; it returns error 409).
I was never shown the message that RoboNeuro should have output. I thought that when RoboNeuro sent the message "View built NeuroLibre Notebook", it meant it had completed the build process.
BTW the latest book incorporates the most up-to-date changes:
Looks like whatever happened for 6b34053ee0ad1ffb3f1a7a8d478cc4f60920658f can be a solution for such cases.
As I've stated up top, I'm 99.999999999% sure that your changes weren't incorporated because you didn't modify the Dockerfile and force a rebuild; a rebuild would have cloned the most up-to-date changes in the repo (the repo is cloned inside the Dockerfile).
As I've mentioned before, all members of our lab have previously tried to set up requirements.txt and postBuild alternatives to the Dockerfile for this book, and did not succeed after weeks of trying, due to the special SoS setup. I'd be very happy if you would try yourself, @ltetrel, and propose a non-Dockerfile solution in a pull request, as you seem to be more knowledgeable with these tools (I am more comfortable with Dockerfiles).
@mathieuboudreau Oh no, you got me wrong here! I totally understand that in your case you have no choice, no worries (indeed, for Octave it seems there is no other option). I am just saying that this is one example of a case where things can go wrong if the user starts to change the internal layout of the repo (like git clone, checkout, etc.).
Ok, so I'm all confused now. I've cleared my Jupyter Book notebooks and such; how do I command RoboNeuro to rebuild while re-executing these notebooks? Is there some kind of queue I need to check before I can resubmit a request? Can I view that queue somewhere?
I've switched "execute notebooks" from off to on in the config file already:
Hmm, I set it to "on" because before it was at "off", but then I checked the Jupyter Book documentation and these aren't valid options for the latest version of Jupyter Book. However, I'm using an older version, and I can't seem to find a way to view their documentation for the version I'm using, to see whether the options have changed (and mine would be valid for my current version) or were always the same (and thus I should use force).
I'll try force.
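For reference, in Jupyter Book 0.10 this setting lives under the `execute:` section of `_config.yml`, and to the best of my knowledge the accepted values are auto, force, cache, and off (a bare on is not among them):

```yaml
# _config.yml sketch (Jupyter Book 0.10): re-run all notebooks on every build.
# Documented values for execute_notebooks: auto, force, cache, "off".
execute:
  execute_notebooks: force
```

Note that an unquoted off can be parsed as a YAML boolean, so quoting it ("off") is the safer spelling.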
@roboneuro generate nl-notebook from branch neurolibre
Attempting NeuroLibre notebook compilation from custom branch neurolibre. Reticulating splines etc...
:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:
@mathieuboudreau Again, this is clearly an issue if RoboNeuro does not tell you that there is already a build in progress (when the API does). There is no possibility to cancel a Binder build; that is why you need to wait for the previous build to finish (or fail). There is no queue or anything:
- You ask for a (Binder) build:
  a. There is already a current build for this repository (even if the commit changed) => request ignored
  b. There is no build for this repository => build triggered
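Until RoboNeuro relays that 409, a submitting client effectively has to poll. A sketch of that behaviour (fake_api is a stand-in for POSTing to the real build endpoint, which is not modelled here):

```shell
#!/bin/sh
# Hypothetical sketch: keep re-submitting while the API answers 409
# (a build is already in progress); stop on any other status.
set -e
attempts=0
fake_api() {   # stand-in for the real API: busy twice, then accepted
    if [ "$attempts" -lt 2 ]; then echo 409; else echo 202; fi
}

status=409
while [ "$status" = "409" ]; do
    status=$(fake_api)        # in real use: sleep/back off between polls
    attempts=$((attempts + 1))
done
echo "HTTP $status after $attempts attempts"   # -> HTTP 202 after 3 attempts
```

The same loop structure would apply whether the retry lives in RoboNeuro or in a user-facing script; the key is treating 409 as "try again later", not as a failure.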
Ok, so if it skips when there's already a build, then it shouldn't be a problem to try if I didn't get a notice.
Ok, so if it skips when there's already a build, then it shouldn't be a problem to try if I didn't get a notice.
Yep, now it is building something from 112s ago:
build-qmrlab-2dt1-5fbook-54ed4d-330-28 1/1 Running 0 112s
commit 330da581e4154314d693c61648bdc7cfcf3a676a
We ran into a problem building your book. :(
I just tried on RoboNeuro, and it "succeeded": http://neurolibre-data.conp.cloud/book-artifacts/qmrlab/github.com/t1_book/330da581e4154314d693c61648bdc7cfcf3a676a/_build/html/
So actually it seems that there are some issues with your build, as RoboNeuro just told you: http://neurolibre-data.conp.cloud/book-artifacts/qmrlab/github.com/t1_book/330da581e4154314d693c61648bdc7cfcf3a676a/book-build.log It seems that this is the issue we encountered when trying to run on the Binder instance.
Here's the log, I have no idea what the errors mean.
WARNING: Unknown key in `_toc.yml`: expand_sections
WARNING: Unknown key in `_toc.yml`: expand_sections
WARNING: multiple files found for the document "01/IR_References": ['01/IR_References.ipynb', '01/IR_References.md']
Use '/home/jovyan/work/t1_book/content/01/IR_References.md' for the build.
Exception occurred:
File "/opt/conda/lib/python3.6/site-packages/nbclient/client.py", line 251, in _kernel_manager_class_default
from jupyter_client import AsyncKernelManager
ImportError: cannot import name 'AsyncKernelManager'
The full traceback has been saved in /tmp/sphinx-err-f6y1zkwb.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at <https://github.com/sphinx-doc/sphinx/issues>. Thanks!
Traceback (most recent call last):
File "/opt/conda/lib/python3.6/site-packages/traitlets/traitlets.py", line 528, in get
value = obj._trait_values[self.name]
KeyError: 'kernel_manager_class'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/conda/lib/python3.6/site-packages/jupyter_book/sphinx.py", line 150, in build_sphinx
app.build(force_all, filenames)
File "/opt/conda/lib/python3.6/site-packages/sphinx/application.py", line 352, in build
self.builder.build_update()
File "/opt/conda/lib/python3.6/site-packages/sphinx/builders/__init__.py", line 298, in build_update
len(to_build))
File "/opt/conda/lib/python3.6/site-packages/sphinx/builders/__init__.py", line 310, in build
updated_docnames = set(self.read())
File "/opt/conda/lib/python3.6/site-packages/sphinx/builders/__init__.py", line 417, in read
self._read_serial(docnames)
File "/opt/conda/lib/python3.6/site-packages/sphinx/builders/__init__.py", line 438, in _read_serial
self.read_doc(docname)
File "/opt/conda/lib/python3.6/site-packages/sphinx/builders/__init__.py", line 478, in read_doc
doctree = read_doc(self.app, self.env, self.env.doc2path(docname))
File "/opt/conda/lib/python3.6/site-packages/sphinx/io.py", line 221, in read_doc
pub.publish()
File "/opt/conda/lib/python3.6/site-packages/docutils/core.py", line 218, in publish
self.settings)
File "/opt/conda/lib/python3.6/site-packages/sphinx/io.py", line 126, in read
self.parse()
File "/opt/conda/lib/python3.6/site-packages/docutils/readers/__init__.py", line 78, in parse
self.parser.parse(self.input, document)
File "/opt/conda/lib/python3.6/site-packages/myst_nb/parser.py", line 76, in parse
self.env, ntbk, show_traceback=self.env.config["execution_show_tb"]
File "/opt/conda/lib/python3.6/site-packages/myst_nb/execution.py", line 142, in generate_notebook_outputs
allow_errors=env.config["execution_allow_errors"],
File "/opt/conda/lib/python3.6/site-packages/jupyter_cache/executors/utils.py", line 56, in single_nb_execution
record_timing=False,
File "/opt/conda/lib/python3.6/site-packages/nbclient/client.py", line 1117, in execute
return NotebookClient(nb=nb, resources=resources, km=km, **kwargs).execute()
File "/opt/conda/lib/python3.6/site-packages/nbclient/util.py", line 78, in wrapped
return just_run(coro(*args, **kwargs))
File "/opt/conda/lib/python3.6/site-packages/nbclient/util.py", line 57, in just_run
return loop.run_until_complete(coro)
File "/opt/conda/lib/python3.6/asyncio/base_events.py", line 468, in run_until_complete
return future.result()
File "/opt/conda/lib/python3.6/site-packages/nbclient/client.py", line 542, in async_execute
async with self.async_setup_kernel(**kwargs):
File "/opt/conda/lib/python3.6/site-packages/async_generator/_util.py", line 34, in __aenter__
return await self._agen.asend(None)
File "/opt/conda/lib/python3.6/site-packages/nbclient/client.py", line 480, in async_setup_kernel
self.km = self.create_kernel_manager()
File "/opt/conda/lib/python3.6/site-packages/nbclient/client.py", line 359, in create_kernel_manager
self.km = self.kernel_manager_class(kernel_name=self.kernel_name, config=self.config)
File "/opt/conda/lib/python3.6/site-packages/traitlets/traitlets.py", line 556, in __get__
return self.get(obj, cls)
File "/opt/conda/lib/python3.6/site-packages/traitlets/traitlets.py", line 535, in get
value = self._validate(obj, dynamic_default())
File "/opt/conda/lib/python3.6/site-packages/nbclient/client.py", line 251, in _kernel_manager_class_default
from jupyter_client import AsyncKernelManager
ImportError: cannot import name 'AsyncKernelManager'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/conda/bin/jupyter-book", line 8, in <module>
sys.exit(main())
File "/opt/conda/lib/python3.6/site-packages/click/core.py", line 1137, in __call__
return self.main(*args, **kwargs)
File "/opt/conda/lib/python3.6/site-packages/click/core.py", line 1062, in main
rv = self.invoke(ctx)
File "/opt/conda/lib/python3.6/site-packages/click/core.py", line 1668, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/opt/conda/lib/python3.6/site-packages/click/core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/opt/conda/lib/python3.6/site-packages/click/core.py", line 763, in invoke
return __callback(*args, **kwargs)
File "/opt/conda/lib/python3.6/site-packages/jupyter_book/commands/__init__.py", line 305, in build
result, builder, OUTPUT_PATH, build_type, PAGE_NAME, click.echo
File "/opt/conda/lib/python3.6/site-packages/jupyter_book/commands/__init__.py", line 546, in builder_specific_actions
raise RuntimeError(_message_box(msg, color="red", doprint=False)) from result
RuntimeError:
===============================================================================
There was an error in building your book. Look above for the cause.
===============================================================================
Running Jupyter-Book v0.10.0
Source Folder: /home/jovyan/work/t1_book/content
Config Path: /home/jovyan/work/t1_book/content/_config.yml
Output Path: /home/jovyan/data/book-artifacts/qmrlab/github.com/t1_book/330da581e4154314d693c61648bdc7cfcf3a676a/_build/html
Running Sphinx v3.5.4
making output directory... done
myst v0.13.7: MdParserConfig(renderer='sphinx', commonmark_only=False, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, update_mathjax=True, enable_extensions=['html_image', 'amsmath', 'colon_fence', 'deflist', 'dollarmath', 'html_admonition', 'linkify', 'replacements', 'smartquotes', 'substitution'], disable_syntax=[], url_schemes=['mailto', 'http', 'https'], heading_anchors=None, html_meta=[], footnote_transition=True, substitutions=[], sub_delimiters=['{', '}'])
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 12 source files that are out of date
updating environment: [new config] 12 added, 0 changed, 0 removed
reading sources... [ 8%] 01/IR_BenefitsAndPitfalls
Executing: 01/IR_BenefitsAndPitfalls in: /home/jovyan/work/t1_book/content/01
Specifically, I never call `AsyncKernelManager` in my notebooks.
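For what it's worth, the traceback points at nbclient's own internals rather than user code: nbclient's executor tries to import `AsyncKernelManager` from `jupyter_client`, and that class only exists in newer jupyter_client releases (as far as I know it landed around 6.1). A quick sketch for probing an environment, e.g. inside the Docker image:

```python
import importlib


def has_async_kernel_manager() -> bool:
    """Return True if the installed jupyter_client exposes the
    AsyncKernelManager class that nbclient tries to import."""
    try:
        mod = importlib.import_module("jupyter_client")
    except ImportError:
        # jupyter_client is not installed at all
        return False
    return hasattr(mod, "AsyncKernelManager")


if __name__ == "__main__":
    if has_async_kernel_manager():
        print("jupyter_client is new enough for nbclient")
    else:
        print("jupyter_client is too old; try: pip install -U jupyter_client")
```

If this prints the second message inside the image, pinning a newer jupyter_client there should clear this particular ImportError.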
Did you try to build the book locally, as I proposed a few weeks ago? (running jupyter-book inside your built Docker image)
Are we sure that Jupyter Book executes the notebooks inside my Docker image?
Yes, I've tried (separately) to (1) build the book inside my image and (2) execute my notebooks, and both worked in isolation. I think this is the first time the notebooks are being executed during the jupyter-book build.
> Are we sure that Jupyter Book executes the notebooks inside my Docker image?
And here is the log from the k8s pod:
Running `jupyter-book build`:
docker run -v /DATA:/home/jovyan/data binder-registry.conp.cloud/binder-registry.conp.cloud/binder-qmrlab-2dt1-5fbook-54ed4d:330da581e4154314d693c61648bdc7cfcf3a676a jupyter-book build --path-output /home/jovyan/data/book-artifacts/qmrlab/github.com/t1_book/330da581e4154314d693c61648bdc7cfcf3a676a content
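To make the server honor the commit hash that arrives with the API request, one option would be to fetch and check out that exact commit in the work tree before invoking `jupyter-book build`. This is only a sketch of the idea; the function name and the example path are hypothetical, not NeuroLibre's actual code:

```python
import subprocess


def checkout_requested_commit(repo_dir: str, commit: str) -> None:
    """Fetch the latest refs and check out the exact commit sent with
    the build request, so a stale clone baked into a cached Docker
    layer cannot pin the build to an old HEAD."""
    subprocess.run(["git", "fetch", "--all"], cwd=repo_dir, check=True)
    subprocess.run(["git", "checkout", commit], cwd=repo_dir, check=True)


# Hypothetical usage, before launching the book build:
# checkout_requested_commit("/home/jovyan/work/t1_book",
#                           "330da581e4154314d693c61648bdc7cfcf3a676a")
```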
> Yes, I've tried (separately) to (1) build the book inside my image and (2) execute my notebooks, and both worked in isolation. I think this is the first time the notebooks are being executed during the jupyter-book build.
Yes indeed, because before it was using the cached content!
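On the Jupyter Book side, whether notebooks are re-run at build time is controlled by `execute_notebooks` in `_config.yml`: with `auto` or `cache`, notebooks that already have outputs (or a cached execution) are not re-executed, which would be consistent with the notebooks only now running for the first time. A fragment, assuming Jupyter Book v0.10 and a hypothetical timeout value:

```yaml
# _config.yml
execute:
  execute_notebooks: force  # re-run every notebook on each build
  timeout: 600              # seconds per cell; hypothetical value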
Submitting author: @mathieuboudreau (Mathieu Boudreau)
Repository: https://github.com/qMRLab/t1-book-neurolibre
Branch with paper.md (empty if default branch): main
Version: v1.0.0
Editor: @agahkarakuzu
Reviewers: @agahkarakuzu
Reproducible preprint: Pending
Repository archive: Pending
Data archive: Pending
Book archive: Pending
Docker archive: Pending
Author instructions
Thanks for submitting your paper to NeuroLibre @mathieuboudreau.
@mathieuboudreau if you have any suggestions for potential reviewers then please mention them here in this thread (without tagging them with an @). In addition, the people on this list have already agreed to review for NeuroLibre and may be suitable for this submission (please start at the bottom of the list).
Editor instructions
The NeuroLibre submission bot @roboneuro is here to help you find and assign reviewers and start the main review. To find out what @roboneuro can do for you, type: