Open zz-yun opened 7 months ago
Hi,
I'm slightly confused. In general, QUILT throws errors that can themselves be confusing when it runs out of memory.
Each sample in QUILT is processed independently. Can you try running fewer samples at a time, using fewer cores? In an extreme case, run small sets of individuals with 1 core each, then merge back together.
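For reference, here is a minimal sketch of that split-and-merge approach, reusing the flags from the commands below. The set size of 10, the directory names, and the exact name of the VCF that QUILT writes into each --outputdir are assumptions to adapt:

```bash
# Split the bamlist into small sets and impute each set with 1 core.
split -l 10 bamlist_1.txt set_                 # 10 samples per set is arbitrary

for s in set_*; do
  QUILT.R --outputdir=quilt_out_"$s" \
          --chr=chr03 --regionStart=12800001 --regionEnd=13800001 \
          --buffer=200000 --bamlist="$s" \
          --reference_haplotype_file=chr03.hap.gz \
          --reference_legend_file=chr03.legend.gz \
          --nGen=100 --nCores=1
done

# Merge the per-set VCFs back into one multi-sample VCF.
# bcftools merge needs indexed inputs; check the actual VCF filename
# QUILT writes into each --outputdir before relying on this glob.
for v in quilt_out_set_*/*.vcf.gz; do bcftools index -t "$v"; done
bcftools merge quilt_out_set_*/*.vcf.gz -Oz -o chr03.region.merged.vcf.gz
```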
Please let me know if this keeps happening at a number of cores and an amount of RAM where running out of memory isn't reasonably likely.
Thanks, Robbie
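(One quick way to confirm whether memory really is the limit, assuming a Linux system with GNU time installed: run a single small job under /usr/bin/time -v and read off the peak memory. A sketch, where one_set.txt is a placeholder bamlist:)

```bash
# GNU time (/usr/bin/time, not the shell builtin) with -v reports peak RSS.
/usr/bin/time -v QUILT.R --outputdir=quilt_test --chr=chr03 \
    --regionStart=13600001 --regionEnd=14600001 --buffer=200000 \
    --bamlist=one_set.txt \
    --reference_haplotype_file=chr03.hap.gz \
    --reference_legend_file=chr03.legend.gz \
    --nGen=100 --nCores=1 2> time.log        # -v output goes to stderr

grep "Maximum resident set size" time.log    # peak memory, in kilobytes
```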
Dear developers, I have 290 samples to impute, and I run QUILT in chunks.
QUILT.R --outputdir=quilt_output_chunks --chr=chr03 --regionStart=12800001 --regionEnd=13800001 --buffer=200000 --bamlist=bamlist_1.txt --reference_haplotype_file=chr03.hap.gz --reference_legend_file=chr03.legend.gz --nGen=100 --nCores=40
I encountered an error when I ran it with multiple cores (40 cores). After allocating more memory (200 GB), the run completed successfully.
However, a similar command failed even with the same fix. The command:
QUILT.R --outputdir=quilt_output_chunks --chr=chr03 --regionStart=13600001 --regionEnd=14600001 --buffer=200000 --bamlist=bamlist_1.txt --reference_haplotype_file=chr03.hap.gz --reference_legend_file=chr03.legend.gz --nGen=100 --nCores=40
The error is always the same.
I then tried setting the number of cores to 1 for these tasks, but encountered the same error. What could be the issue?
Thank you. Any feedback is welcome.
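Since each sample is processed independently, one way to narrow this down is to rerun the failing region with a single sample and see whether the error follows a particular BAM or the region itself. A sketch reusing the flags above (one_sample.txt and the output directory are made-up names):

```bash
head -n 1 bamlist_1.txt > one_sample.txt       # any one BAM from the list

QUILT.R --outputdir=quilt_one_sample \
        --chr=chr03 --regionStart=13600001 --regionEnd=14600001 \
        --buffer=200000 --bamlist=one_sample.txt \
        --reference_haplotype_file=chr03.hap.gz \
        --reference_legend_file=chr03.legend.gz \
        --nGen=100 --nCores=1
```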