Closed grst closed 2 years ago
@alex-d13, please look it through and LMK if anything is unclear.
Hi,
Thanks a lot for the help, everything seems to be clear to me.
Yeah, the vignettes with sfaira are an issue in themselves. I usually tried to pre-compile them locally, then push the compiled pkgdown vignette to its branch manually. That's a really annoying workflow since I have to do it before every push to main, but I don't really know how to solve it differently. The vignette first has to build a conda environment using basilisk and then downloads one example dataset from sfaira. I guess that's where pkgdown hits its memory issue. Another reason for this pre-computation is the time limit on Bioconductor: they give me timeout warnings if I build the vignette on their system. This is the blogpost I am using to precompile the vignettes: https://ropensci.org/blog/2019/12/08/precompute-vignettes/
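For context, the precompile step from that blogpost boils down to knitting a heavy `.Rmd.orig` source into a static `.Rmd` on a machine that already has the conda/sfaira setup, so downstream builds only render the lightweight result. A minimal sketch (the vignette file names are illustrative, not SimBu's actual paths):

```shell
# Knit the heavy vignette source (.Rmd.orig) into a plain .Rmd once,
# locally, so pkgdown/Bioconductor only render the pre-computed output.
# File names below are made up for illustration.
knit_vignette() {
  Rscript -e 'knitr::knit("vignettes/simulation.Rmd.orig", output = "vignettes/simulation.Rmd")'
}

if command -v Rscript >/dev/null 2>&1; then
  knit_vignette
else
  echo "Rscript not available; run this on a machine with R + knitr installed"
fi
```

The resulting `.Rmd` (with chunks already evaluated) is what gets committed, so neither pkgdown nor Bioconductor ever runs the basilisk/sfaira code.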
I see! We can of course somehow exclude the vignette from the CI. The memory limit on GH actions is 7GB, which is not too little.
Maybe a little downsampling or choosing a different example dataset will suffice?
I could try that, let me see if I can find a smaller dataset.
But I don't think I can decrease the runtime; installing the Python packages just takes some time. So maybe I can find a solution where pkgdown builds the vignettes and deploys them on one branch (main_deploy), with another branch to push to Bioconductor (which has to be the main branch, I think). But I don't know if that's a good 'git' solution :D
If the vignette build runs on GitHub, you could set it up to auto-commit the pre-compiled vignette, then Bioconductor wouldn't need to build it.
You can take inspiration from the document action for that.
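The core of such an auto-commit step is "stage the knitted vignette, commit only if it actually changed". A minimal sketch of that logic, run here inside a throwaway demo repo (paths and commit messages are illustrative, not SimBu's real layout):

```shell
# Demo of the auto-commit logic in a throwaway repo: stage the
# pre-compiled vignette and commit only when it changed, so CI
# doesn't create empty commits on every run.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "init"

mkdir -p vignettes
echo "pre-compiled vignette" > vignettes/simulation.Rmd  # stand-in for knitr output

git add vignettes
if git diff --cached --quiet; then
  echo "vignette unchanged, nothing to commit"
else
  git -c user.name=ci -c user.email=ci@example.com \
    commit -q -m "chore: pre-compile vignette [skip ci]"
  echo "vignette committed"
fi
```

In a real workflow this would run as a step after the vignette build, pushing with the workflow's token; the `[skip ci]` marker keeps the bot commit from retriggering the build.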
🚀 Deployed on https://632d79519b5e1b6cedc6f23c--simbu-pkgdown-pr-build.netlify.app