gevro opened this issue 1 year ago
I think the issue is that the code at this step is trying to write to a directory that is not accessible from inside the Singularity container, so I need to specify which host directories to bind into the container.
Do you have a list of all the directories that zUMIs needs access to, other than the input files and the output directory, including any temp directories?
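For context, this is roughly how I'm invoking it now; the paths, image name, and YAML file below are placeholders for my actual setup, and I'm assuming the usual zUMIs.sh -y entry point:

singularity exec \
  --bind /path/to/input,/path/to/output \
  zUMIs.sif \
  bash zUMIs.sh -y my_run.yaml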
Hello, just checking in: any idea how to fix this? I think the issue is SQLite trying to write to some temp directory that is not writable when zUMIs is run as a Singularity container. Similar reports: https://github.com/YuLab-SMU/clusterProfiler/issues/441 https://github.com/NRCan/geo-deep-learning/issues/454 https://github.com/nipreps/fmriprep/issues/2525
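If it really is SQLite's temp directory, one workaround I can sketch (untested here; the paths, image name, and YAML are placeholders) is to point SQLite at a writable, bind-mounted location. SQLite honors the SQLITE_TMPDIR environment variable, and Singularity passes host variables prefixed with SINGULARITYENV_ through into the container:

mkdir -p /path/to/writable_tmp
export SINGULARITYENV_SQLITE_TMPDIR=/path/to/writable_tmp
singularity exec \
  --bind /path/to/writable_tmp \
  zUMIs.sif \
  bash zUMIs.sh -y my_run.yaml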
Hi, just checking again whether you can help with this issue. Thanks!
I know it's been over a year, but I also ran into a similar issue (a "database or disk is full" error) with another package that uses SQLite, while running under Singularity, so I wanted to follow up in case it's helpful for others. I found that adding the --contain and --workdir flags to singularity exec fixed the issue, at least in my case. Example run command:
singularity exec \
  --cleanenv \
  --contain \
  --bind run:/run,var-lib-rstudio-server:/var/lib/rstudio-server,database.conf:/etc/rstudio/database.conf \
  --bind /path/to/project \
  --workdir rstudio_tmp \
  mysif.sif \
  /usr/lib/rstudio-server/bin/rserver --auth-none=0 --auth-pam-helper-path=pam-helper --server-user=$USER --www-port 1313
https://docs.sylabs.io/guides/3.1/user-guide/cli/singularity_exec.html
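For what it's worth, my understanding of why this works: --contain gives the container its own minimal /tmp and /var/tmp instead of sharing the host's, and --workdir backs those (plus $HOME when --contain is used) with a host directory of your choosing, so SQLite ends up with a writable temp location. The workdir should exist before you launch:

mkdir -p rstudio_tmp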
Hi, I made a Docker image with zUMIs and got past the prior error, but now I'm getting this error. I'm running it as a Singularity image, so I know disk space is not the issue. Any suggestions?