tsalo closed this issue 2 years ago
Thanks @tsalo, I will look into it. While I was working on the cluster, I forgot to upgrade to the latest version; that is why `repo2data` was failing.
Can you check again now? @tsalo
Will do. Before I submit, though, I wanted to check whether I should put the destination as `./../data` or `./data`. This example uses `./../data`, while we used `./data` in the OHBM 2021 NiMARE tutorial here, and that seemed to work just fine, but we ran it several months ago.
EDIT: I went with `./data`. 🤞
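For reference, the destination is the `dst` field of the `data_requirement.json` spec; a minimal, hypothetical example (the `src` URL and project name below are placeholders, not the actual values from the repo):

```json
{
    "src": "https://osf.io/download/xxxxx/",
    "dst": "./data",
    "projectName": "nimare-paper"
}
```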
Ok, sorry if the doc is not clear on that topic; explaining this requires non-straightforward thinking, because the user needs to "project himself" into another environment.
You can find an example here that makes use of the `repo2data` api and which is much cleaner:
https://github.com/SIMEXP/Repo2Data/issues/16
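A minimal sketch of that api usage, pieced together from the linked example; the spec path, the list indexing on `install()`, and the `./data` fallback are assumptions for illustration, not something stated in this thread:

```python
# Hedged sketch of the repo2data Python API, as in the linked example.
# The spec path and the "./data" fallback are assumptions.
try:
    from repo2data.repo2data import Repo2Data

    # Reads src/dst from the JSON spec and downloads (or reuses) the data,
    # returning the resulting path(s).
    data_path = Repo2Data("./binder/data_requirement.json").install()[0]
except Exception:
    # repo2data not installed, spec missing, or no network:
    # assume a local copy already sits in ./data.
    data_path = "./data"

print(data_path)
```

The broad `except` is deliberate for this sketch: it lets the same cell run both on the NeuroLibre server (where the download happens) and locally (where the data may already exist).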
In the past the build would "succeed", though the executed code within the book would be full of exceptions. Now, it looks like the build failed (https://roboneuro.herokuapp.com/preview?id=4fde2561a01885b60dbcdb14).
The error is:
```
Your preprint build failed to compile with the following errors:
{}
```
I'm not sure if the problem is with my book's code or if it's still an issue with the data, so I figured I'd post about it here in case it's the latter.
Ok this is a common issue that we have with the frontend, when the log is not properly output to the user... Can you post that here https://github.com/neurolibre/roboneuro ? Sorry for all the posting ^^'
In the meantime, do you have the logs by email ?
I've opened https://github.com/neurolibre/roboneuro/issues/21. I haven't received an email with the log for this build attempt, though I did receive them for the ones I submitted in the past (before the `repo2data` update).
Also, there is a 15-min retry timeout that the frontend should be able to pick up (but it is not), so maybe re-try in 15 min? (Again, this will be added in the doc.)
I tried resubmitting, but it still failed without a log.
I will check the build pod for repo2data logs
Alright, there seems to be an issue. I think it is because the jupyterhub version does not match the current state of the cluster anymore. I will investigate that and will let you know @tsalo
Thanks @ltetrel! Please let me know if you need anything from me.
ATM no, I am able to reproduce your issue; thanks for the report!
@tsalo it should be working now.
You will also need `repo2data` in your req file https://github.com/NBCLab/nimare-paper/blob/master/binder/requirements.txt, because now I use the user pod to run repo2data. Use the `repo2data.repo2data` api to properly get the data path, as in:
https://github.com/neurolibre/repo2data-caching/blob/master/notebooks/nilearn-example.ipynb

I added `repo2data` to the requirements and incorporated the fetching code into the relevant pieces of the repo, but there seem to be two problems: `repo2data` will try to access the data repository even when there's a copy of the data available locally. I can include if/else statements throughout the book to work around this, I guess, but it would make the code less readable.

I increased the timeout to 1h, but it seems that it is taking a long time to download your data; I am not sure why. Now I see two processes which are still downloading the data:
```
jupyter-nbclab-2dnimare-2dpaper-2debc1w6bw   0/1   PodInitializing   0   40m   neurolibre-test-node1   2021-11-11T14:16:47Z
jupyter-nbclab-2dnimare-2dpaper-2df8wbk9iz   0/1   PodInitializing   0   51m   neurolibre-test-node1   2021-11-11T14:06:27Z
```
How much time do you need to download it? I also just spawned a new one, so I would suggest checking back in a few hours; it may be a network or internet traffic jam on Compute Canada.
Also, what is the execution time of the jupyter book build?
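Regarding the if/else workaround mentioned above, a minimal sketch of such a guard (the helper name, the local directory, and the spec path are all hypothetical, not from the actual repo):

```python
import os


def get_data_path(local_dir="./data"):
    """Return the dataset path, skipping the download when a local copy exists.

    NOTE: hypothetical helper for illustration; `local_dir` and the spec
    path below are assumptions, not the nimare-paper repo's actual layout.
    """
    if os.path.isdir(local_dir) and os.listdir(local_dir):
        # A non-empty local copy is already present:
        # avoid hitting the remote data repository at all.
        return os.path.abspath(local_dir)
    # No local copy: fall back to repo2data (imported lazily, so the book
    # still runs in environments where only the data, not repo2data, exists).
    from repo2data.repo2data import Repo2Data

    repo2data = Repo2Data("./binder/data_requirement.json")
    return repo2data.install()[0]
```

Centralizing the check in one helper keeps the if/else logic out of the individual book chapters, which addresses the readability concern.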
I see the data downloaded, do you still have issues @tsalo ?
The build ended up failing without a log again. I'm not sure why, if the data did download (which is definitely progress!).
> Also what is the execution time of the jupyter book build ?
Sorry, I didn't notice this question. It can take like 8 hours on my university's HPC.
Ok, I will close this one because the data download is resolved; you can create a new issue for the jupyter-book execution.
I am attempting to use RoboNeuro to test build a book I plan to submit to NeuroLibre. I believe that the data_requirement file is correctly configured, but my book build fails because the data are not located where I expect them to be. I don't know if the data are downloading to a different location, or if the download step is failing completely.
The repository is NBCLab/nimare-paper. The dataset should be ~391MB, so I think it's within NeuroLibre's restrictions. It's stored on Google Drive, but I linked the Drive folder to an OSF repository so I could use osfclient for downloading.
Original issue: https://github.com/NBCLab/nimare-paper/issues/22. @ltetrel recommended that I open an issue here.