Hello,
I am currently working with a reasonably large dataset of EPIC v1 and v2 arrays (> 100 samples) on my institution's Slurm-based compute cluster. The issue I am running into is that, despite having already run sesameDataCache() with an up-to-date ExperimentHub package, I am receiving an error stating that the cache is not present. Is there a reason why this error would occur?
```r
library(sesame)

# Cache sesame data
sesameDataCacheAll()

# Get directory with .idat files
idat_dir <- paste(getwd(), "Redacted", sep = "/")

# Get QC data for samples
qcs_detection <- openSesame(idat_dir, prep = "", func = sesameQC_calcStats, funs = "detection")
```
```
Error: BiocParallel errors
  1 remote errors, element index: 1
  120 unevaluated and other errors
  first remote error:
Error in stopAndCache(title):
  | File idatSignature needs to be cached to be used in sesame.
  | Please make sure you have updated ExperimentHub and try
  | > sesameDataCache()
  | to retrieve and cache needed sesame data.
Execution halted
```
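In case it helps diagnose this, here is a minimal check I can run on a compute node to see whether the cache populated on the login node is actually visible there (the shared path below is a placeholder, and pointing the cache at a shared filesystem via `EXPERIMENT_HUB_CACHE` is an assumption on my part, not something I have confirmed fixes this):

```r
library(ExperimentHub)

# Where ExperimentHub expects its cache on this node
getExperimentHubOption("CACHE")

# Default per-user cache location
tools::R_user_dir("ExperimentHub", which = "cache")

# If the cache lives on storage not mounted on the compute nodes,
# perhaps it needs to be relocated to a shared filesystem before caching:
# Sys.setenv(EXPERIMENT_HUB_CACHE = "/shared/path/ExperimentHub")
# sesameData::sesameDataCache()
```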