Maggi-Chen / FusionSeeker

A gene fusion caller for long-read transcriptome sequencing data.
MIT License

raw_signal.py reads record which doesn't exist #2

Open Theo-Nelson opened 2 years ago

Theo-Nelson commented 2 years ago

Dear Maggi,

Recently I was analyzing sample DRR138513 from the European Nucleotide Archive, and I received the error message below while running FusionSeeker on the sorted BAM output. The error occurs whether I run FusionSeeker from your original GitHub repo or from my fork.

What I noticed is that the program was searching for a particular record, record_read_chr9_GL383539v1_alt, whose GL number is one less than the GL numbers I do see within the directory. The image below shows this comparison.

[Screenshot (2022-09-10): the record name FusionSeeker searches for vs. the record_read files actually present in the output directory]
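A quick way to list which per-chromosome record files are absent is to compare the chromosome names against the directory contents. This is a hypothetical diagnostic I wrote for illustration; `missing_records` is my own name and not part of FusionSeeker:

```python
import os

def missing_records(chroms, outpath):
    """Return chromosome names that have no record_read_<chrom> file
    under <outpath>/raw_signal, i.e. the files FusionSeeker tries to open."""
    written = set(os.listdir(os.path.join(outpath, 'raw_signal')))
    return [c for c in chroms if 'record_read_' + c not in written]
```

The chromosome list can be taken from the BAM header, e.g. `samtools idxstats sorted.bam | cut -f1`.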

Machine details:

```
NAME="Ubuntu"
VERSION="18.04.6 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.6 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
```

Full pipeline outputs: https://github.com/Theo-Nelson/long-read-sequencing-pipeline-examples/blob/main/long_read_rna_seq_analysis-DRR138513.ipynb

Please let me know whether this is a problem with the sample or the program. Thank you very much!

Sincerely, Theo

kristianunger commented 9 months ago

Adding to this, I got the same error:

```
Traceback (most recent call last):
  File "/home/kunger/data2/FusionSeeker/fusionseeker", line 104, in <module>
    raw_signal.detect_from_split(defusion_args.outpath, goodchrom)
  File "/data2/core-strahl-kitso/FusionSeeker/raw_signal.py", line 373, in detect_from_split
    allsplitread += open(outpath + 'raw_signal/record_read_' + chrom, 'r').read().split('\n')[:-1]
FileNotFoundError: [Errno 2] No such file or directory: 'fusion_out/raw_signal/record_read_chr18'
```

Any hint on how this can be fixed?
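For what it's worth, a possible workaround (an untested sketch, not an official fix; `collect_split_reads` is my own name, with the file layout taken from the traceback above) is to skip chromosomes whose record file was never written instead of letting `open()` raise:

```python
import os

def collect_split_reads(outpath, goodchrom):
    """Variant of the loop in detect_from_split that tolerates
    chromosomes for which no record_read_<chrom> file exists."""
    allsplitread = []
    for chrom in goodchrom:
        record_path = os.path.join(outpath, 'raw_signal', 'record_read_' + chrom)
        if not os.path.exists(record_path):
            # no raw signals were emitted for this chromosome; skip it
            continue
        with open(record_path) as f:
            allsplitread += f.read().split('\n')[:-1]
    return allsplitread
```

Whether silently skipping a chromosome is safe depends on why the file is missing in the first place, so this only papers over the crash.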

yinyuanmtu commented 2 months ago

I got the same error. Hope the developer can help solve the issue.