Closed: jiadong324 closed this issue 1 year ago
How long are your reads? We have not tested COBALT at all with long reads.
I found that this study https://www.nature.com/articles/s41587-022-01468-y uses hmftools for long-read data, so I want to try it.
For my data, the average read length in both the tumor and normal samples is 15kb. Since it successfully calculates A-tumor, I think read length might not be the issue.
I was not aware of that study. 15kb is very long; since we count the read starts in each 1kb bin, I think it could cause problems, but you are right that it may not be the cause here.
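To illustrate the concern with read-start counting (a minimal sketch, not COBALT's actual code; positions and bin size are illustrative):

```python
from collections import Counter

BIN_SIZE = 1000  # depth is estimated from read *starts* per 1kb bin

def bin_read_starts(read_starts, bin_size=BIN_SIZE):
    """Count how many reads start in each bin (bin index = position // bin_size)."""
    return Counter(pos // bin_size for pos in read_starts)

# A 15kb read contributes a start to only ONE bin even though it spans ~15 bins,
# so with long reads the per-bin start counts become sparse and noisy compared
# to short reads at the same coverage.
starts = [100, 950, 2100, 14000]
print(bin_read_starts(starts))  # bins 0, 2, and 14 get counts; bins 1, 3..13 get none
```

This is why long reads could in principle distort bin-level depth estimates, even if they were not the cause of this particular error.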
Can you provide your exact command that you ran on the bam please?
Here is the command:
java -jar ~/Biotools/hmftools/cobalt_v1.14.jar -threads 6 -reference A-normal -reference_bam ./A-normal.bam -tumor A-tumor -tumor_bam ./A-tumor.bam -output_dir ./A/cobalt -gc_profile ./GC_profile.1000bp.38.cnp
BTW, I can successfully run Amber with the same data.
Hi, this is a bug in COBALT; I will put out a fix.
@hongwingl Thanks! Looking forward to your update.
I made a release that fixes this bug: https://github.com/hartwigmedical/hmftools/releases/tag/cobalt-v1.14.1
@hongwingl Great! Thanks for your help!
@hongwingl It works right now. But I have another error running purple:
failed to load ref genome: org.apache.commons.cli.ParseException: Supplied ref genome must have associated sequence dictionary
What is the sequence dictionary?
It is looking for a .dict file that lists the sequence information (contig names, lengths, etc.) of the fasta file. There are many tools you can use to create it, such as samtools dict.
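For illustration, a .dict file is just a SAM-style header: an @HD line followed by one @SQ line per contig with its name (SN), length (LN), and sequence MD5 (M5). A minimal sketch of building those lines in Python (simplified relative to real samtools dict output, which also includes fields like UR; in practice just run samtools dict or Picard CreateSequenceDictionary on your fasta):

```python
import hashlib

def fasta_to_dict_lines(fasta_text):
    """Build simplified sequence-dictionary lines from FASTA text.

    Emits the core fields a .dict file carries: SN (contig name),
    LN (sequence length), M5 (MD5 of the uppercased sequence).
    """
    records = []
    name, seq = None, []
    for line in fasta_text.splitlines():
        if line.startswith(">"):
            if name is not None:
                records.append((name, "".join(seq)))
            name, seq = line[1:].split()[0], []
        else:
            seq.append(line.strip())
    if name is not None:
        records.append((name, "".join(seq)))

    lines = ["@HD\tVN:1.6"]
    for contig, sequence in records:
        sequence = sequence.upper()
        md5 = hashlib.md5(sequence.encode()).hexdigest()
        lines.append(f"@SQ\tSN:{contig}\tLN:{len(sequence)}\tM5:{md5}")
    return lines

# Example with two tiny made-up contigs:
fasta = ">chr1 test\nACGT\nACGT\n>chr2\nGGCC\n"
for line in fasta_to_dict_lines(fasta):
    print(line)
```

Tools like purple expect this file next to the fasta (same basename, .dict extension) so they can validate contig names and lengths without reading the whole genome.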
Thanks! But I also get the same error as mentioned in https://github.com/hartwigmedical/hmftools/issues/405. The GC profile is GC_profile.1000bp.38.cnp, and the reference file is the same one I used for read alignment.
Can I close this? It looks like it has been fixed in #405.
Dear developer,
I am running cobalt (v1.14) on tumor-normal paired long-read data, and the errors are shown below. Is that due to the low coverage of the normal sample (~20X)?
Looking forward to your reply, thanks!