matinnuhamunada opened this issue 1 week ago
The worker nodes do not allow downloading files via FTP. To fix this, edit the main Snakefile so that rule `ncbi_genome_download` runs on the parent (login) node:
```python
report: "report/workflow.rst"
localrules: all, ncbi_genome_download
...
```
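For context, `localrules` tells Snakemake to execute the listed rules on the node where Snakemake itself was started, rather than submitting them as cluster jobs, so `ncbi_genome_download` keeps the FTP access of the parent node. A minimal sketch of the top of the main Snakefile (everything beyond the two lines quoted above is assumed):

```python
report: "report/workflow.rst"

# Rules listed here run locally, on the node where Snakemake was
# started, instead of being submitted to the cluster scheduler.
localrules:
    all,
    ncbi_genome_download
```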
Some jobs seem to fail because they exceed the memory usage limit. This needs to be fixed in the LSF profile.
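One way this could be addressed is to raise the default per-job memory request in the Snakemake profile so a higher limit is passed to LSF. The sketch below assumes the snakemake-executor-plugin-lsf convention of forwarding `mem_mb` to the scheduler; the 8 GB figure is only an illustration, not a value from the BGCFlow repository:

```yaml
# Hypothetical profile config.yaml -- values are illustrative.
executor: lsf
jobs: 10
default-resources:
  mem_mb: 8000   # per-job memory request forwarded to LSF
```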
For DTU students/staff who want to run BGCFlow on the LSF HPC facility, follow these steps:
Install Miniforge on the login node: https://github.com/conda-forge/miniforge#mambaforge
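The Miniforge README installs from a release script named after your OS and architecture; on the login node that is roughly:

```shell
# Download the installer matching this machine's OS/architecture,
# then run it in batch (non-interactive) mode.
curl -L -O "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
bash "Miniforge3-$(uname)-$(uname -m).sh" -b
```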
Get extra space by requesting a scratch directory: email HPC support at support@cc.dtu.dk (see https://www.hpc.dtu.dk/?page_id=927). This results in something like:
```
2 directories, 0 files
(base) ~/drive/bgcflow
```
```shell
linuxsh  # log in to one of the worker nodes
```
```shell
conda run -n bgcflow mamba install bioconda::snakemake-executor-plugin-lsf -y
```
Then run the workflow. To find the right queue, use:

```shell
bqueues -u <user id>
```
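With the executor plugin installed and a queue chosen, the launch command looks roughly like the following; `--executor lsf` is the plugin's registered executor name, while the `lsf_queue` resource key and the job limit are assumptions to check against the plugin's documentation:

```shell
# Hypothetical launch command -- substitute the queue found with
# bqueues and a job limit appropriate for the facility.
snakemake --executor lsf --jobs 10 \
    --default-resources "lsf_queue='<queue name>'"
```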