BenLangmead / bowtie2

A fast and sensitive gapped read aligner
GNU General Public License v3.0

What's the reason for the huge difference in memory footprint between the paper and Bowtie2 v2.4.2? #366

Open Lilu-guo opened 2 years ago

Lilu-guo commented 2 years ago

With 101 bp Illumina reads, the paper's Supplementary Table 1 reports a peak memory footprint of 3.24 GB for both unpaired and paired alignment. But in a practical test with v2.4.2, the peak memory footprint was 2.669 GB for unpaired and 3.234 GB for paired.

Is this a version difference, or am I using it wrong? And why does paired alignment use more memory than unpaired?

Lilu-guo commented 2 years ago

Maybe I've found the answer: the offrate defaults to 32, but 16 was actually used in the paper's tests.
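A rough back-of-envelope sketch (my own illustration, not from the thread) of why the offrate would move peak memory by this much: if the offrate value denotes the suffix-array sampling interval of the FM-index, then halving the interval doubles the stored sample. The genome size (~3.1 Gbp, human) and 4-byte offsets below are assumptions for illustration.

```python
# Back-of-envelope: size of the FM-index suffix-array sample as a
# function of the sampling interval (the "offrate" marking interval).
GENOME_POSITIONS = 3_100_000_000  # approx. human genome length (assumption)
BYTES_PER_OFFSET = 4              # 32-bit SA offsets (assumption)

def sa_sample_bytes(interval: int) -> int:
    """Bytes needed to store every `interval`-th suffix-array offset."""
    return (GENOME_POSITIONS // interval) * BYTES_PER_OFFSET

gib = 1024 ** 3
for interval in (16, 32):
    print(f"interval {interval:2d}: {sa_sample_bytes(interval) / gib:.2f} GiB")
```

Under these assumptions, going from an interval of 32 to 16 adds roughly 0.36 GiB, which is the same order of magnitude as the gap reported above (3.24 GB in the paper vs. 2.669 GB unpaired with the default index), so a denser suffix-array sample in the paper's index is a plausible explanation.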