Gleeson-Lab / wxs_pipeline

Starting with BAMs and FASTQs, follow GATK 4.0 Best Practices up to generating a joint-genotyped VCF

Adjust Walltimes Based on Size of Data #17

Open · brcopeland opened this issue 2 years ago

brcopeland commented 2 years ago

Scaling the requested walltime with the size of the input data would be helpful: smaller data files can be prioritized onto the glean queue (which is free), while larger data files request enough time to avoid having their jobs killed.
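A minimal sketch of the idea in Snakemake, assuming a version where resource callables can see `input.size_mb` and accept string-valued resources, and a profile that maps a `walltime` resource to the scheduler. The rule, paths, and the roughly one-hour-per-10 GB scaling are hypothetical placeholders, not the pipeline's actual values:

```python
# Hypothetical sketch, not the pipeline's actual rule: request a walltime
# proportional to the size of the input BAM, so small samples stay under
# the glean queue's limit and large samples are not killed mid-run.

def walltime_from_input(wildcards, input, attempt):
    # assumption: roughly 1 hour per 10 GB of input, with a 2-hour floor;
    # 'attempt' lets a resubmitted job ask for proportionally more time
    hours = max(2, int(input.size_mb // 10_240) + 1) * attempt
    return f"{hours:02d}:00:00"

rule mark_duplicates:
    input:
        bam="mapped/{sample}.bam",          # hypothetical path layout
    output:
        bam="dedup/{sample}.bam",
        metrics="dedup/{sample}.metrics.txt",
    resources:
        walltime=walltime_from_input,       # resource name depends on the profile
        mem_mb=16000,
    shell:
        "gatk MarkDuplicates -I {input.bam} -O {output.bam} -M {output.metrics}"
```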

shishenyxx commented 2 years ago

Jobs expected to take under 8 hours go on the glean queue and get resubmitted for up to 3 cycles in case they are killed; jobs of 8 hours or more go on the home queue, with memory/cores increased on every resubmission; after maybe 2 resubmissions on home, report an error ... something like that?
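For reference, this kind of escalation can be expressed with the `attempt` argument Snakemake passes to resource callables, provided resubmission is enabled in the profile (`retries`/`restart-times`) and the Snakemake version accepts string-valued resources. A hedged sketch; the queue names come from this comment, while the rule, thresholds, and resource names are only placeholders:

```python
# Hypothetical escalation sketch, not the pipeline's actual code:
# early attempts run on the free glean queue; once a job has been
# killed a few times it is resubmitted to the home queue with more
# memory. Requires resubmission to be enabled in the profile
# (e.g. retries/restart-times >= 4) so 'attempt' increments.

def pick_queue(wildcards, attempt):
    # attempts 1-3 on glean, later attempts fall back to home
    return "glean" if attempt <= 3 else "home"

def pick_mem_mb(wildcards, attempt):
    # grow the memory request on every resubmission
    return 8000 * attempt

rule haplotype_caller:
    input:
        bam="dedup/{sample}.bam",
        ref="ref/genome.fa",                # hypothetical reference path
    output:
        gvcf="gvcf/{sample}.g.vcf.gz",
    resources:
        qname=pick_queue,                   # resource name the profile maps to -q
        mem_mb=pick_mem_mb,
    shell:
        "gatk HaplotypeCaller -R {input.ref} -I {input.bam} "
        "-O {output.gvcf} -ERC GVCF"
```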

brcopeland commented 2 years ago

I think what you describe is sensible, but those are matters for the Snakemake profile. For the pipeline itself, I want it to have a good estimate of the walltime and to leave the queue, reruns, etc. up to the profile.
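For completeness, the queue and rerun side would then live in the profile rather than in the Snakefile. A hypothetical PBS-style profile `config.yaml`, with the queue names taken from this thread and everything else (submit command, resource names, retry count) purely illustrative:

```yaml
# Hypothetical profile sketch, not the repository's actual profile:
# the Snakefile supplies walltime/memory estimates; the profile decides
# the queue, the submit command, and how many times to resubmit.
restart-times: 3                # "retries" on newer Snakemake versions
default-resources:
  - qname=glean                 # default queue; rules/attempts may override it
  - mem_mb=4000
cluster: "qsub -q {resources.qname} -l walltime={resources.walltime} -l mem={resources.mem_mb}mb"
```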