Hope you have been well! I have a couple of samples that are getting a “segmentation fault” error. I have given the job plenty of RAM (250 GB), so I don’t believe memory is the issue.
Input bps file: /researchers/ahwan.pandey/Projects/WholeGenome/Project_EC/CASCADE/analysis_21_06_2024_Somatic/for_jeremiah/AN_T_66871_2200010_23_N_66871_GL.bps.txt.gz
Analysis id: test
...vcf - reading breakpoints: /researchers/ahwan.pandey/Projects/WholeGenome/Project_EC/CASCADE/analysis_21_06_2024_Somatic/for_jeremiah/AN_T_66871_2200010_23_N_66871_GL.bps.txt.gz
...vcf - read in 1,111,139 indels and 2,481,183 SVs
...vcf - SV deduplicating 2,481,183 events
...deduping at 0
...added 2:33,141,583-33,141,583(+) with 1827839 overlaps to SV-pileup blacklist at dedupe
NB: Any SV with one breakend occuring more than 500 times will be removed. To change this behavior, adjust HIGH_OVERLAPLIMIT define in vcf.cpp and recompile/run
---blacklist 2:33,141,383-33,141,783(+)
...added 2:33,091,892-33,091,892(-) with 538 overlaps to SV-pileup blacklist at dedupe
NB: Any SV with one breakend occuring more than 500 times will be removed. To change this behavior, adjust HIGH_OVERLAPLIMIT define in vcf.cpp and recompile/run
---blacklist 2:33,141,383-33,141,783(+)
---blacklist 2:33,091,692-33,092,092(-)
...deduping at 20,000
...deduping at 40,000
...deduping at 60,000
...deduping at 80,000
...deduping at 100,000
...deduping at 120,000
...deduping at 140,000
...deduping at 160,000
...deduping at 180,000
...deduping at 200,000
...deduping at 220,000
...deduping at 240,000
...deduping at 260,000
...deduping at 280,000
...deduping at 300,000
...deduping at 320,000
...deduping at 340,000
...deduping at 360,000
...deduping at 380,000
...deduping at 400,000
...deduping at 420,000
...deduping at 440,000
...deduping at 460,000
...deduping at 480,000
...deduping at 500,000
...deduping at 520,000
...deduping at 540,000
...deduping at 560,000
...deduping at 580,000
...deduping at 600,000
...deduping at 620,000
...deduping at 640,000
...deduping at 660,000
...deduping at 680,000
...deduping at 700,000
...deduping at 720,000
...deduping at 740,000
...deduping at 760,000
...deduping at 780,000
...deduping at 800,000
Segmentation fault
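From the log, the last blacklisted breakend (2:33,141,583 with 1,827,839 overlaps) is far past the 500-overlap `HIGH_OVERLAPLIMIT` cutoff, so a pre-scan for hyper-recurrent breakends might help narrow down where the crash happens. Below is a rough sketch of such a scan; note the column layout is an assumption (tab-separated, with the breakend's chromosome and position in the first two columns), which may not match the real `bps.txt.gz` format:

```python
import gzip
from collections import Counter

def count_breakends(path, limit=500):
    """Count occurrences of each breakend position in a gzipped,
    tab-separated breakpoints file (assumed layout: chrom, pos, ...).
    Returns the positions seen more than `limit` times, mirroring
    svaba's HIGH_OVERLAPLIMIT cutoff mentioned in the log."""
    counts = Counter()
    with gzip.open(path, "rt") as fh:
        for line in fh:
            fields = line.rstrip("\n").split("\t")
            if len(fields) >= 2:
                counts[(fields[0], fields[1])] += 1
    return {bp: n for bp, n in counts.items() if n > limit}
```

This only approximates what the dedupe step sees, but if it reports a handful of positions with counts in the millions (like 2:33,141,583 above), those rows would be the first thing I'd look at.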
I have uploaded the data here for you to test it out. https://www.dropbox.com/scl/fo/p80pu1gmqb33v14ljg6kx/APAOa3LGMRYn_-9TbsxLTgk?rlkey=ms5n5v67l0gm4c69jnkuijj98&dl=0
I believe it is failing at the “tovcf” part. The command that causes the segfault is:
The error output from the run is pasted in the log above.
I am using the following git commit: https://github.com/walaj/svaba/commit/63ffa293cc35cb063af4be5e05bcfb8841a90cc9
Please help!