Closed (by roboneuro, 2 years ago)
I decided to not run the conversion at all, and just loaded the pre-generated Dataset files. Hopefully that will resolve the memory issues. I resubmitted to RoboNeuro, but I can switch to using https://binder.conp.cloud now, @ltetrel, if you'd prefer.
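The load-instead-of-rebuild pattern described here can be sketched with the standard library (a hypothetical helper for illustration only; the actual notebooks load pre-generated NiMARE Dataset files):

```python
import gzip
import os
import pickle


def load_or_build(path, build_fn):
    """Return the cached object at `path` if it exists; otherwise run the
    expensive conversion once and cache its result for future builds."""
    if os.path.exists(path):
        with gzip.open(path, "rb") as f:
            return pickle.load(f)
    obj = build_fn()  # the memory-hungry step we want to skip on Binder
    with gzip.open(path, "wb") as f:
        pickle.dump(obj, f)
    return obj
```

When the pre-generated file ships with the repository, only the load branch ever runs on the build server, so the conversion's memory footprint is never paid there.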
Yes please use https://binder.conp.cloud, that way it will be easier to debug and fix.
Okay, I've submitted it: https://binder.conp.cloud/v2/gh/NBCLab/nimare-paper/master. I guess the next step is to wait until it finishes building. Or fails, I suppose.
The idea is to run the notebooks on the binder once it is built, so it is easier to fix on the fly. Rather than trying and pushing a fix on another environment (the HPC you are using).
It looks like it times out, even with a 2-hour limit:
Server requested
2022-01-18T17:43:35.111992Z [Normal] Successfully assigned binderhub/jupyter-nbclab-2dnimare-2dpaper-2dirw0glm7 to neurolibre-test-node1
2022-01-18T17:43:39Z [Normal] Container image "jupyterhub/k8s-network-tools:1.1.2" already present on machine
2022-01-18T17:43:40Z [Normal] Created container block-cloud-metadata
2022-01-18T17:43:41Z [Normal] Started container block-cloud-metadata
2022-01-18T17:43:42Z [Normal] Container image "binder-registry.conp.cloud/binder-registry.conp.cloud/binder-nbclab-2dnimare-2dpaper-3e379f:7910ae57ae5ddd4d5b9bd260936336bd30f01783" already present on machine
2022-01-18T17:43:43Z [Normal] Created container notebook
2022-01-18T17:43:44Z [Normal] Started container notebook
Spawn failed: pod binderhub/jupyter-nbclab-2dnimare-2dpaper-2dirw0glm7 did not start in 7200 seconds!
Launch attempt 1 failed, retrying...
Server requested
2022-01-18T19:36:10.114971Z [Normal] Successfully assigned binderhub/jupyter-nbclab-2dnimare-2dpaper-2dfjiwi8x3 to neurolibre-test-node1
2022-01-18T19:36:14Z [Normal] Container image "jupyterhub/k8s-network-tools:1.1.2" already present on machine
2022-01-18T19:36:15Z [Normal] Created container block-cloud-metadata
2022-01-18T19:36:17Z [Normal] Started container block-cloud-metadata
2022-01-18T19:36:18Z [Normal] Container image "binder-registry.conp.cloud/binder-registry.conp.cloud/binder-nbclab-2dnimare-2dpaper-3e379f:7910ae57ae5ddd4d5b9bd260936336bd30f01783" already present on machine
2022-01-18T19:36:19Z [Normal] Created container notebook
2022-01-18T19:36:20Z [Normal] Started container notebook
Unfortunately, I can't tell what pages are problematic.
From the logs, it seems to be:
executing outdated notebooks...
Executing: /home/jovyan/content/09_subtraction.md
Executing: /home/jovyan/content/05_cbma.md
Alright, I added a new mechanism to bypass jb build for debugging purposes only.
@tsalo, you need to add --neurolibre-debug to your latest commit message, followed by anything else (it matches the neurolibre-debug sequence); example here. This will give you access to the binder environment without waiting for jb build to finish.
That way you should be able to run and test your notebooks, and make sure they work within our time limits: 1 hour per notebook and 10 hours maximum in total (no longer 2 hours per notebook).
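For example, an empty commit is enough to trigger the debug build; the message wording below is just an illustration (anything may follow the flag):

```shell
# Illustration only: an empty commit whose message carries the debug flag.
# (In practice you would run just the `git commit` line in your paper repo;
# the sandbox setup here only makes the snippet self-contained.)
repo=$(mktemp -d) && cd "$repo" && git init -q
git config user.email "you@example.com" && git config user.name "Your Name"
git commit --allow-empty -m "--neurolibre-debug trying notebooks interactively"
git log -1 --pretty=%s  # the flag is the first token of the message
```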
Hello @tsalo, did you have time to debug your submission on our binder instance?
It is really important to be strict about memory usage; for example, you may want to limit the memory here.
I just tried reducing it to 500 MB; it is still crashing.
I don't understand why the notebook does not catch this error and keeps running; maybe because the process is multithreaded...
Hi @ltetrel. Unfortunately, I'm pretty buried with dissertation stuff at the moment, so I haven't had time to work on debugging this. Hopefully I'll be able to work on it next week. Sorry for the delay!
No need to be sorry! Take your time and good luck with your dissertation.
I tried using memory_limit='100mb', n_iters=500 (down from 2500), and n_cores=1 (so it's not multithreaded), but the build still crashed on SCALE. I can pre-generate the outputs for this step and just share the code that would be run to create them, like I've done for other memory-intensive processes, but I'm concerned about doing that too often. WDYT?
Hi @tsalo,
We agreed to push the limit to 8 GB of RAM, even if this is a preprint service. Do you mind re-trying the execution now and letting me know?
Sorry for the delay. I discovered some inefficiencies in the SCALE code, so I improved it a bit and made a new NiMARE release. The book successfully built on the CONP Binder server!
@tsalo, right now the compilation step is bypassed with the --neurolibre-debug flag.
Were you able to execute all your notebooks within the NeuroLibre environment? Only once you are sure all your notebooks run should you re-submit without the flag.
I ran the whole build from a terminal in the interactive session, so I think it's good to go. I can push a commit without the flag now.
Alright make a commit and I will ask a build here :)
Done! Thanks!
@roboneuro generate nl-notebook
:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:
> I ran the whole build from a terminal in the interactive session, so I think it's good to go. I can push a commit without the flag now.
Hmm, this is actually a good idea, I will keep that in mind for our documentation :)
Hmm, I think roboneuro is stalled; I will take a look manually.
@roboneuro generate nl-notebook
:seedling: We are currently building your NeuroLibre notebook! Good things take time :seedling:
We ran into a problem building your book. :(
["Found built imag", "server running at https://binder.conp.cloud/jupyter/user/nbclab-nimare-paper-x8osdzpd/\\n\", \"image\": \"binder-registry.conp.cloud/binder-registry.conp.cloud/binder-nbclab-2dnimare-2dpaper-3e379f:a729a098a49409b5c518ea8003c3653b0244927b\", \"repo_url\": \"https://github.com/NBCLab/nimare-paper\", \"token\": \"qI26wRbpQjan8JX9iJUIjQ\", \"binder_ref_url\": \"https://github.com/NBCLab/nimare-paper/tree/a729a098a49409b5c518ea8003c3653b0244927b\", \"binder_launch_host\": \"https://binder.conp.cloud/\", \"binder_request\": \"v2/gh/NBCLab/nimare-paper.git/a729a098a49409b5c518ea8003c3653b0244927b\", \"binder_persistent_request\": \"v2/gh/NBCLab/nimare-paper/a729a098a49409b5c518ea8003c3653b0244927b"]
@roboneuro generate nl-notebook
:point_right::page_facing_up: View built NeuroLibre Notebook :page_facing_up::point_left:
@tsalo I went through the whole submission and the outputs seem OK. If you can confirm on your side as well, we will be good to proceed with the submission!
The outputs look good to me. Thanks!
Hi @tsalo,
Sorry, there have been some delays because we are still finalizing the submission workflow. In the meantime, you can test this permanent production binder link for your paper: https://binder-mcgill.conp.cloud/v2/gh/roboneurolibre/nimare-paper/HEAD. Can we push the static content manually to the website, @agahkarakuzu @emdupre? http://neurolibre-data.conp.cloud/book-artifacts/NBCLab/github.com/nimare-paper/a729a098a49409b5c518ea8003c3653b0244927b/_build/html/00_abstract.html We still don't have the DOI.
An immediate solution would be to use the test server for static, and the production server for live. This is for review purposes, and should work fine. We are weeks away from completing the workflow, including DOIs, so you will be able to update the final URLs to the paper before it is accepted.
Thanks @ltetrel and @pbellec! I ended up using the test server link for my Aperture submission.
Fingers crossed for the submission. I would be very curious to hear if the reviewers end up using the reproducible aspects of the notebooks. The test server should work just fine for this.
> Thanks @ltetrel and @pbellec! I ended up using the test server link for my Aperture submission.
Please use the production link; because we increased the RAM, we can handle very few users on the test server. And it is not supposed to be used to serve a submission, but rather for building.
@ltetrel sorry, I must have misunderstood what you said before. I just linked to the HTML files, not the Binder.
OK, the static content (HTML files) is fine.
Is there any update on the DOI? @angielaird would like to cite the preprint in a manuscript.
I've just received the authorization, I will start working on it soon!
Thanks @agahkarakuzu!
@roboneuro start production
I'm sorry dear human, I don't understand that. You can see what commands I support by typing:
@roboneuro commands
@roboneuro production start
:zap: :zap: :zap: Starting NeuroLibre production process. These are the steps I'll try to complete:
:zap: We are currently building your NeuroLibre notebook for production! :twisted_rightwards_arrows: (fork)
Submitting author: @tsalo (Taylor Salo)
Repository: https://github.com/NBCLab/nimare-paper
Editor: @pbellec
Reviewers: @agahkarakuzu
Jupyter Book: http://neurolibre-data-prod.conp.cloud/book-artifacts/roboneurolibre/github.com/nimare-paper/0195c842c8b6f5d42150df814500a9aadb05cc75/_build/html/
Repository archive: 10.5281/zenodo.6624793
Data archive: 10.5281/zenodo.6624795
Book archive: 10.5281/zenodo.6624790
Docker archive: 10.5281/zenodo.6624797
Status
Status badge code:
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@ltetrel, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions/concerns, please let @pbellec know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Review checklist for @ltetrel
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper