wodanaz / Assembling_viruses


Fix out of memory errors when using a conda environment #40

Closed · johnbradley closed this issue 3 years ago

johnbradley commented 3 years ago

When using the conda environment, some steps use more memory than with the module version, so Slurm jobs are being killed for exceeding their memory limits. Update the pipeline to fix this problem.
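
For reference, a minimal sketch of how a pipeline step runs as a Slurm batch job; the step name, environment name, and memory figure below are illustrative, not the pipeline's actual values. Slurm kills any job that uses more memory than it requested:

#!/bin/bash
#SBATCH --job-name=ev-step   # hypothetical step name
#SBATCH --mem=8G             # memory request; Slurm kills the job if the step uses more
# activate the conda environment (name is hypothetical; assumes conda is initialized in batch shells)
conda activate assembling-viruses
# ... run this pipeline step here ...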

johnbradley commented 3 years ago

@wodanaz Is there a directory on HARDAC with some input files that I could use to reproduce this problem? Thanks.

wodanaz commented 3 years ago

The error occurs while installing the pipeline as a conda environment.

[screenshot: conda environment creation fails with a disk-quota-exceeded error]
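
In case it helps narrow things down, conda can report where it will write environments and packages. A quick diagnostic sketch (the ~/miniconda3 path is an assumption):

conda info                                 # shows the envs directories and package cache locations conda will write to
du -sh ~/.conda ~/miniconda3 2>/dev/null   # how much of the home-directory quota the conda tree is using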

johnbradley commented 3 years ago

OK. The problem is that you exceeded the disk quota for your home directory. The fix is to hold your conda environments in a lab directory instead. To do that, add the envs_dirs option to your .condarc conda config file so conda uses a lab directory for environments. For example, the following two lines tell conda to keep your environments in /data/wraycompute/alejo/conda_envs:

envs_dirs:
  - /data/wraycompute/alejo/conda_envs
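
If editing .condarc by hand is awkward, conda can append the same setting from the command line and echo it back (same effect as the two lines above):

conda config --append envs_dirs /data/wraycompute/alejo/conda_envs
conda config --show envs_dirs    # confirm the lab directory is now listed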
wodanaz commented 3 years ago

Adding these two lines to .condarc worked, and the environment installed. Thanks!

johnbradley commented 3 years ago

Marking this closed. I misunderstood the problem: the failure was a home-directory disk quota issue during installation, not the pipeline's memory usage.