Closed cohenp05 closed 4 years ago
Hi @cohenp05 !
Running it manually seems like the best first step to figure out what's going wrong.
Unfortunately, you can't directly pass files into a singularity or docker container. You need to mount the folder into the container and then specify the path of the dataset at its new location. The nice thing is that this should simplify your command a little:
singularity run -B /hpc/users/cohenp05/Dyno_Objects:/mount /hpc/users/cohenp05/Comp1_Container/ti_comp1_latest.sif --dataset=/mount/Monocyte_Dyno.h5 --output=/mount/Monocyte_Comp1.h5 --dimred=umap
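To see why the dataset paths change, note that the bind mount -B src:dst makes the host folder visible at dst inside the container, so every file under it gets a container-side path. A minimal Python sketch of that remapping (the paths are taken from the command above; the helper function is purely illustrative, not part of singularity or dynverse):

```python
def to_container_path(host_path, bind_src, bind_dst):
    """Translate a host path into its in-container equivalent for -B bind_src:bind_dst."""
    if not host_path.startswith(bind_src):
        raise ValueError(f"{host_path} is not under the bound folder {bind_src}")
    return bind_dst + host_path[len(bind_src):]

# The dataset lives in the bound folder on the host ...
host = "/hpc/users/cohenp05/Dyno_Objects/Monocyte_Dyno.h5"
# ... so inside the container it is addressed under /mount instead.
print(to_container_path(host, "/hpc/users/cohenp05/Dyno_Objects", "/mount"))
# -> /mount/Monocyte_Dyno.h5
```

This is why --dataset and --output in the command above point at /mount/... rather than at the /hpc/users/... paths.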
Also, if you look at the output of singularity run /hpc/users/cohenp05/Comp1_Container/ti_comp1_latest.sif --help, it should tell you that the dimred parameter only accepts a string naming which dimred method to use -- you can't provide your own dimred this way.
Could you verify whether running it manually succeeds?
Robrecht
Closing this issue. Feel free to reply if further input is needed.
Hi all,
Thanks for an amazing package. Forgive me, as I'm new to using containers, and I'm having some trouble getting the singularity containers to run on our scientific computing cluster. I have pulled all the TI method images to my local machine and uploaded them to the cluster, but I can't get singularity to recognize that directory as the cache directory to run from. When I use the babelwhale call as suggested (like below) and point the singularity cache_dir to my directory, I get this error:
Here is my sessionInfo:
I've also tried running the containers in a bash script by exporting my dynverse object and a dimensional reduction object to h5 files and running singularity run directly on the image like so:
And then I get this error
Please help! I'm out of ideas.