How did you install Unzip? (The Github version is too old.)
To debug this, look in 3-unzip/1-hasm and 4-quiver/track_reads. Look for a file called something like *.stderr.
Also, you need to post your .cfg so I can see your pypeflow settings. (If you use the fs_based pwatcher, then stderr will be in the pwatcher.dir. With the blocking pwatcher, it's in the run-dir.)
Also, you should always always always run on a small test-case first. We have a pretty quick one called greg200k-sv2, available via FALCON-examples: https://github.com/pb-cdunn/FALCON-examples
cd FALCON-examples/run/greg200k-sv2
git-sym .
# edit *.cfg
make falcon
make unzip
But you need a reasonably up-to-date Unzip tarball: https://github.com/PacificBiosciences/FALCON_unzip/wiki/Binaries
And in general, you need to provide a lot more information. I am not allowed much time to help via GitHub.
@pb-cdunn I have the same problem! I installed FALCON_unzip according to https://pb-falcon.readthedocs.io/en/latest/quick_start.html with the tarball from 3/12/2018. The tree of the output dir is:
|-- 0-rawreads
| |-- build_rdb.sh
| |-- cns-chunks
| |-- cns-gather
| |-- cns-runs
| |-- cns-split
| |-- daligner-chunks
| |-- daligner-gathered
| |-- daligner-intermediate-gathered-las
| |-- daligner-runs
| |-- daligner-split
| |-- las-gather
| |-- las-merge-chunks
| |-- las-merge-gathered
| |-- las-merge-runs
| |-- las-merge-split
| |-- length_cutoff
| |-- my.input.fofn
| |-- preads
| |-- pwatcher.dir
| |-- raw_reads.db
| |-- rdb_build_done
| |-- report
| |-- run_jobs.sh
| |-- run.sh
| |-- run.sh.done
| |-- task.json
| |-- task.sh
| |-- template.sh
| `-- user_script.sh
|-- 1-preads_ovl
| |-- build_pdb.sh
| |-- daligner-chunks
| |-- daligner-gathered
| |-- daligner-intermediate-gathered-las
| |-- daligner-runs
| |-- daligner-split
| |-- db2falcon
| |-- las-gather
| |-- las-merge-chunks
| |-- las-merge-gathered
| |-- las-merge-runs
| |-- las-merge-split
| |-- pdb_build_done
| |-- preads.db
| |-- preads.fofn
| |-- pwatcher.dir
| |-- run_jobs.sh
| |-- run.sh
| |-- run.sh.done
| |-- task.json
| |-- task.sh
| |-- template.sh
| `-- user_script.sh
|-- 2-asm-falcon
| |-- a_ctg_all.fa
| |-- a_ctg_base.fa
| |-- a_ctg_base_tiling_path
| |-- a_ctg.fa
| |-- a_ctg_tiling_path
| |-- asm.gfa
| |-- chimers_nodes
| |-- c_path
| |-- ctg_paths
| |-- falcon_asm_done
| |-- fc_ovlp_to_graph.log
| |-- p_ctg.fa
| |-- p_ctg_tiling_path
| |-- preads4falcon.fasta
| |-- preads.ovl
| |-- pwatcher.dir
| |-- run.sh
| |-- run.sh.done
| |-- sg_edges_list
| |-- sg.gfa
| |-- task.json
| |-- task.sh
| |-- template.sh
| |-- user_script.sh
| `-- utg_data
|-- 3-unzip
| `-- reads
|-- 4-quiver
| `-- track_reads
|-- all.log
|-- assembly.sh
|-- assembly.sh.e1059657
|-- assembly.sh.e1059658
|-- assembly.sh.o1059657
|-- assembly.sh.o1059658
|-- bam
| |-- m54168_180413_073102.subreads.bam
| `-- m54168_180413_073102.subreadset.xml
|-- config.json
|-- falcon_unzip.e
|-- falcon_unzip.o
|-- fc_run_fungal.cfg
|-- fc_run.log
|-- fc_unzip.cfg
|-- fc_unzip.log
|-- fc_unzip_quiver.log
|-- foo.snake
|-- input_bam.fofn
|-- input.fofn
|-- mypwatcher
| |-- exits
| |-- heartbeats
| |-- jobs
| `-- wrappers
`-- rhc_falcon_unzip.sh
The fc_unzip.cfg:
[General]
#job_type = SGE
job_type = local
#job_queue = default
[Unzip]
input_fofn= input.fofn
input_bam_fofn= input_bams.fofn
#path to bin directory containing samtools, blasr, and various GenomicConsensus utilities
smrt_bin=/home/zengqd/software/geno_ass/FALCON/fc_env/bin /home/zengqd/software/geno_ass/FALCON/GenomicConsensus/bin
#smrt_bin=/share/nas3/zqd/wheat-trans/soft/samtools-1.9/samtools /share/nas3/zqd/wheat-trans/soft/gen_ass/FALCON/fc_env/bin/blasr /share/nas3/zqd/wheat-trans/soft/gen_ass/FALCON/fc_env/bin/ /share/nas3/zqd/wheat-trans/soft/gen_ass/GenomicConsensus-2.3.2/bin
#smrt_bin=/path/to/smrtcmds/bin/
#sge_phasing= -pe smp 12
#sge_quiver= -pe smp 24
#sge_track_reads= -pe smp 12
#sge_blasr_aln= -pe smp 24
#sge_hasm= -pe smp 48
#unzip_blasr_concurrent_jobs = 80
unzip_phasing_concurrent_jobs =24
quiver_concurrent_jobs = 24
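(Not part of the original post, but a quick sanity check for a smrt_bin line like the one above is to confirm that each listed directory really contains the tools named in the comment; the directory paths below are copied from that cfg, and quiver is only one example of a GenomicConsensus entry point:)
# Sketch: verify that the smrt_bin directories contain the expected executables.
for d in /home/zengqd/software/geno_ass/FALCON/fc_env/bin \
         /home/zengqd/software/geno_ass/FALCON/GenomicConsensus/bin; do
    for exe in samtools blasr quiver; do
        if [ -x "$d/$exe" ]; then echo "found   $d/$exe"; else echo "MISSING $d/$exe"; fi
    done
done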
The stderr and stdout are attached.
You would need to check stderr for the failing task:
ERROR - Task Node(4-quiver/track_reads) failed with exit-code=256
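For example (a sketch, not from the thread itself; mypwatcher comes from the directory tree posted above), that task's stderr is typically either under its run directory or under the pwatcher directory:
# Sketch: find and print the stderr of the failed track_reads task.
ls -lt 4-quiver/track_reads
find 4-quiver/track_reads mypwatcher -name '*stderr*' -exec tail -n 30 {} + 2>/dev/null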
I successfully completed a FALCON assembly and now I tried running FALCON-unzip, where I get the following error message:

Looking into the stderr file of the failed job, the reason seems to be a hardcoded timeout. Do you have any idea how I could fix this? Any help is much appreciated.