I'm trying to use SnapATAC on our (pretty old) compute platform. First of all, it should be said that the platform doesn't support HDF5 file locking because of its Lustre filesystem. In the past this has not been a problem, as I usually export the environment variable
$ export HDF5_USE_FILE_LOCKING=FALSE
This works when dealing with HDF5 files from scanpy and even snaptools (i.e. I'm able to read/write snap files in Python).
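As a minimal sketch of what "works in Python" means here: the variable has to be set before libhdf5 initializes, i.e. before the first `import h5py`. This round-trips a tiny dataset with locking disabled (the file name and path are made up for the example; it assumes h5py is installed):

```python
import os

# HDF5_USE_FILE_LOCKING is read when libhdf5 initializes, so it must be
# exported (or set here) before the first import of h5py.
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

import tempfile

import h5py

# Round-trip a small dataset to confirm read/write works without locking.
path = os.path.join(tempfile.mkdtemp(), "demo.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("x", data=[1, 2, 3])
with h5py.File(path, "r") as f:
    print(f["x"][:].tolist())  # -> [1, 2, 3]
```

If this succeeds on the Lustre mount but the equivalent access from R fails, the variable is presumably not reaching the HDF5 library that rhdf5 links against.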
Whenever I try to initialize an object with SnapATAC I get this:
To make SnapATAC work I have to import the snap files on another system first and save the RDS object back to the cluster. I can process data up to the clustering step, but later steps are not possible (e.g. running MACS doesn't work, as it requires access to the original snap file from R).
The whole thing may be related to rhdf5 and Rhdf5lib, yet I would like to debug this from the SnapATAC side.
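One thing worth ruling out (an assumption on my part, not confirmed from the error above): the variable must be exported in the environment of the R process itself, so that the libhdf5 bundled with Rhdf5lib inherits it at startup. A sketch of two ways to do that:

```shell
# Export in the shell that launches R (e.g. in the job script), so the
# R process and its libhdf5 inherit it:
export HDF5_USE_FILE_LOCKING=FALSE

# Or persist it in ~/.Renviron, which R reads at startup:
echo 'HDF5_USE_FILE_LOCKING=FALSE' >> ~/.Renviron
```

If SnapATAC still fails with the variable visible inside the R session, that would point more firmly at rhdf5/Rhdf5lib.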
In reality, rhdf5 seems able to access the file properly:
and
but then it fails.