Closed: kswarts closed this issue 10 years ago
Hi Kelly, thanks for your input! I'm not sure how we can easily get around this error: apparently a 140GB file is so big that sharding it creates more temporary files than the system allows a single process to have open at once. You could try making larger (and therefore fewer) shards: try running format_sra with the flag -num 250.
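For context on why larger shards help, here's a minimal sketch (not aTRAM's actual code; the shard count and filename template are made up) of how holding one File::Temp handle per shard runs into the per-process open-file limit:

```perl
#!/usr/bin/env perl
# Minimal sketch, NOT aTRAM's actual code: if one temp filehandle is held
# open per shard, the number of simultaneously open files grows with the
# shard count, and a very large input can need more shards than the
# per-process limit allows.
use strict;
use warnings;
use File::Temp qw(tempfile);

my $num_shards = 5000;    # hypothetical; well above the common 1024 default
my @handles;
for my $shard (0 .. $num_shards - 1) {
    # Each tempfile() call opens (and keeps open) another file, analogous
    # to the Oh43.sorted.254.2.XXXX templates in the error below.
    my ($fh, $filename) = tempfile("demo.sorted.$shard.XXXX", UNLINK => 1);
    push @handles, $fh;   # handle stays open for the rest of the run
}
# Eventually tempfile() dies with "Too many open files" (EMFILE).
# Fewer, larger shards (the -num 250 suggestion) keep the open-handle
# count under the limit.
```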
That worked, thanks!
Great! Closing the issue.
Hi, I really like your pipeline and it's working very well for the most part, but I've run into an error I can't get around:
Error in tempfile() using /workdir/NAMFlowcells/aTRAM/aTRAMDB/Oh43/Oh43.sorted.254.2.XXXX: Could not create temp file /workdir/NAMFlowcells/aTRAM/aTRAMDB/Oh43/Oh43.sorted.254.2.y6j2: Too many open files at /home/kls283/Documents/aTRAM/format_sra.pl line 129.
This only happens when the input file is very large.
Thanks very much!
Kelly
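For anyone who lands on this error: the ceiling being hit is the per-process open-file limit, the same number `ulimit -n` prints in the shell. A quick core-Perl way to check it:

```perl
#!/usr/bin/env perl
# Report the per-process open-file ceiling that tempfile() runs into
# (equivalent to `ulimit -n` in the shell).
use strict;
use warnings;
use POSIX qw(sysconf _SC_OPEN_MAX);

printf "max open files per process: %d\n", sysconf(_SC_OPEN_MAX);
```

If your system permits, raising that limit (e.g. `ulimit -n 4096` before running format_sra) is another way around the error, besides making the shards larger.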