simoncchu / GAPPadder

GAPPadder is a tool for closing gaps in draft genomes with short sequencing data.

AttributeError at Collect stage #11

Closed: jzohren closed this issue 2 years ago

jzohren commented 3 years ago

I'm trying to run gappadder on my data, but run into problems at the Collect stage. This is the error I receive:

Traceback (most recent call last):
  File "/camp/lab/turnerj/working/jasmin/software/GAPPadder/main.py", line 283, in <module>
    main_func(scommand,sfconfig)
  File "/camp/lab/turnerj/working/jasmin/software/GAPPadder/main.py", line 254, in main_func
    drc.collect_discordant_regions_v2(sfout_discord_pos)
  File "/camp/lab/turnerj/working/jasmin/software/GAPPadder/run_multi_threads_discordant.py", line 122, in collect_discordant_regions_v2
    ftemp.close()
AttributeError: 'NoneType' object has no attribute 'close'
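For context, this kind of `AttributeError` usually means the file handle was never assigned: the `open()` inside `collect_discordant_regions_v2` failed (for example because an input could not be read or an output folder does not exist), left `ftemp` as `None`, and the later `close()` then blew up. A minimal sketch of that failure mode (hypothetical code, not GAPPadder's actual implementation):

```python
def open_or_none(path):
    # Mimics code that swallows the open() error and returns None instead
    try:
        return open(path, "w")
    except OSError:
        return None

ftemp = open_or_none("/nonexistent/dir/out.txt")
# ftemp is still None here, so calling .close() raises the same error
try:
    ftemp.close()
except AttributeError as err:
    print(err)  # 'NoneType' object has no attribute 'close'
```

So the error itself is a symptom; the real question is why the earlier open/read step failed, which is usually a missing or unreadable input file.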

My config file looks like this:

{
    "draft_genome": {
        "fa": "/camp/lab/turnerj/working/jasmin/data/Dovetail/data/DT2_majorScfs_chrUn.fasta"
    },
    "raw_reads": [
            {
                "left": "/camp/lab/turnerj/working/shared_projects/Dovetail_opossum_WGS/raw_data/DTG_DNA_465_S61_L003_R1_001.fastq.gz",
                "right": "/camp/lab/turnerj/working/shared_projects/Dovetail_opossum_WGS/raw_data/DTG_DNA_465_S61_L003_R2_001.fastq.gz"
            }
      ],
    "alignments": [
            {
                "bam": "/camp/lab/turnerj/working/jasmin/data/Dovetail/bams/10x_reads_to_DT2.bam",
                "is": "300",
                "std": "120"
            }
      ],
    "software_path": {
        "bwa": "/camp/apps/eb/software/BWA/0.7.17-foss-2018b/bin/bwa",
        "samtools": "/camp/apps/eb/software/SAMtools/1.3.1-foss-2016b/bin/samtools",
        "velvet": "/camp/apps/eb/dev/software/Velvet/1.2.10-foss-2016b-mt-kmer_37/bin",
        "kmc": "/camp/apps/eb/software/khmer/1.4.1-foss-2016b-Python-2.7.12/bin",
        "TERefiner": "/camp/lab/turnerj/working/jasmin/software/GAPPadder/TERefiner_1",
        "ContigsMerger": "/camp/lab/turnerj/working/jasmin/software/GAPPadder/ContigsMerger"
    },
    "parameters": {
            "working_folder": "/camp/lab/turnerj/working/jasmin/data/Dovetail/gappadder",
        "min_gap_size": "1000",
        "flank_length": "300",
        "nthreads": "1",
        "verbose": "1"
        },
        "kmer_length": [{
                        "k": 30,
                        "k_velvet": [{
                                "k": 29
                        }, 
                        {
                                "k": 27
                        }]
                }, 
                {
                        "k": 40,
                        "k_velvet": [{
                                "k": 39
                        }, 
                        {
                                "k": 37
                        }]
                },
                {
                        "k": 50,
                        "k_velvet": [{
                                "k": 49
                        }, 
                        {
                                "k": 47
                        }]
                }]
}
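Since the Collect stage reads every path in the config, a quick sanity check that all the files exist, and that each BAM has a `.bai` index next to it, can catch this before launching GAPPadder. A rough sketch, assuming the config is saved as `config.json` (the filename, the helper name, and the `.bai`-next-to-BAM convention are my assumptions):

```python
import json
import os

def missing_inputs(config_path):
    """Return the config paths that do not exist on disk (hypothetical helper)."""
    with open(config_path) as fh:
        cfg = json.load(fh)
    paths = [cfg["draft_genome"]["fa"]]
    for rr in cfg.get("raw_reads", []):
        paths += [rr["left"], rr["right"]]
    for aln in cfg.get("alignments", []):
        # The Collect stage needs an indexed BAM, so also check for the .bai
        paths += [aln["bam"], aln["bam"] + ".bai"]
    return [p for p in paths if not os.path.exists(p)]

if __name__ == "__main__" and os.path.exists("config.json"):
    for p in missing_inputs("config.json"):
        print("missing:", p)
```

If this prints anything, fix those paths (or create the index) before running the Collect stage again.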

Any idea how to fix this? Thanks!

Brent-Saylor-Canopy commented 3 years ago

I had a similar issue. Check this issue page https://github.com/simoncchu/GAPPadder/issues/3

papelypluma commented 2 years ago

Hi @jzohren! By any chance, were you able to resolve this? I've tried using uncompressed read files as suggested by @Brent-Saylor-Canopy, but that doesn't seem to fix it. I'm getting exactly the same error you posted here.

papelypluma commented 2 years ago

I realized that to fix this, (1) the fastq reads have to be unzipped, and (2) the bam files have to be indexed. My bad, I totally forgot about indexing the output bam files. Doing both steps resolved the error for me.
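For anyone hitting the same wall, both steps above are quick to script. The decompression can be done in plain Python with the standard `gzip` module (the paths below are placeholders, not the actual files from this thread), and the BAM index comes from `samtools index`, which writes the `.bai` file next to the BAM:

```python
import gzip
import shutil

def gunzip_fastq(src_gz, dst_fastq):
    """Write an uncompressed copy of a gzipped fastq file."""
    with gzip.open(src_gz, "rb") as fin, open(dst_fastq, "wb") as fout:
        shutil.copyfileobj(fin, fout)

# Example (placeholder paths):
# gunzip_fastq("reads_R1.fastq.gz", "reads_R1.fastq")
#
# Then index the alignment from the shell:
#   samtools index 10x_reads_to_DT2.bam
```

After that, point the config at the uncompressed fastq files and the indexed BAM.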

jzohren commented 2 years ago

Hi @papelypluma, thanks for chipping in, and sorry for my late reply. Great to hear that you figured out how to run it on your data. In my case, I realised I had to adjust a few parameters, such as the insert size, and then got it to work as well.