Closed: johnbradley closed this issue 3 years ago
@wodanaz Is there a directory on HARDAC with some input files that I could use to reproduce this problem? Thanks.
The error occurs when installing the pipeline as a conda environment.
OK. The problem is that the disk quota for your home directory was exceeded. The fix is to use a lab directory to hold your conda environments: add the `envs_dirs` option to your `.condarc` conda config file so conda stores environments in a lab directory instead.
For example, the following two lines tell conda to use /data/wraycompute/alejo/conda_envs to hold your environments:
envs_dirs:
  - /data/wraycompute/alejo/conda_envs
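Equivalently, you can let conda edit the file for you. A minimal sketch, assuming a standard conda install and reusing the example lab directory above:

```bash
# Add the lab directory to the envs_dirs list in ~/.condarc
# (conda creates or updates the YAML entry itself)
conda config --add envs_dirs /data/wraycompute/alejo/conda_envs

# Verify the setting took effect
conda config --show envs_dirs
```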
Adding these two lines to .condarc worked.
Thanks! I think it installed correctly.
Marking this closed. I misunderstood the problem.
When using the conda environment, some steps use more memory than the module version, which results in SLURM jobs being killed for exceeding their memory allocation. Update the pipeline to fix this problem.
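One way to address this, as a minimal sketch only: raise the SLURM memory request for the affected steps. This assumes the pipeline submits each step as an sbatch script; the step name, environment path, and 16G value below are illustrative, not taken from the original thread.

```bash
#!/bin/bash
#SBATCH --job-name=pipeline-step   # hypothetical step name
#SBATCH --mem=16G                  # raised memory request; 16G is illustrative
#SBATCH --cpus-per-task=4

# Make `conda activate` available in a non-interactive batch shell
eval "$(conda shell.bash hook)"
conda activate /data/wraycompute/alejo/conda_envs/pipeline-env  # hypothetical env name

# ... run the pipeline step here ...
```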