PacificBiosciences / FALCON-integrate

Mostly deprecated. See https://github.com/PacificBiosciences/FALCON_unzip/wiki/Binaries and https://github.com/PacificBiosciences/FALCON/wiki/Manual

configure error #197

Open shiyi-pan opened 4 years ago

shiyi-pan commented 4 years ago

I installed FALCON from falcon-2017.06.28-18.01-py2.7-ucs4.tar.gz and ran the fc_run.py script on the greg200k-sv2 test data, but hit the following error:

```
[INFO]Apparently '/ds3512/home/panyp/denovo/falcon/ecoli2' is not in lustre filesystem, which is fine.
[INFO]fc_run started with configuration /ds3512/home/panyp/denovo/falcon/ecoli2/fc_run.cfg
[WARNING]You have several old-style options. These should be provided in the [job.defaults] or [job.step.*] sections, and possibly renamed. See https://github.com/PacificBiosciences/FALCON/wiki/Configuration
['job_type', 'sge_option_cns', 'sge_option_da', 'sge_option_fc', 'sge_option_la', 'sge_option_pda', 'sge_option_pla']
[WARNING]Unexpected keys in input config: set(['ovlp_HPCTANmask_option', 'pa_concurrent_jobs', 'sge_option_da', 'sge_option_pla', 'job_type', 'ovlp_concurrent_jobs', 'jobqueue', 'sge_option_fc', 'sge_option_cns', 'sge_option_la', 'sge_option_pda'])
[WARNING]Please specify "job_type" only in the [job.defaults] section, not in [General].
[WARNING]Please supply a default for "njobs" (aka concurrency) in section [job.defaults]. For now, we will use 8
[INFO]cfg=
{
  "General": {
    "LA4Falcon_preload": false,
    "avoid_text_file_busy": true,
    "bestn": 12,
    "dazcon": false,
    "falcon_sense_greedy": false,
    "falcon_sense_option": "--output-multi --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 6",
    "falcon_sense_skip_contained": false,
    "fc_ovlp_to_graph_option": " --min_len 12000",
    "genome_size": 0,
    "input_fofn": "input.fofn",
    ...

Traceback (most recent call last):
  File "/ds3512/home/panyp/denovo/falcon/falcon-py2.7-ucs4/bin/fc_run.py", line 5, in <module>
    main(sys.argv)
  File "/ds3512/home/panyp/ruanjian/Python2.7/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 724, in main
    main1(argv[0], args.config, args.logger)
  File "/ds3512/home/panyp/ruanjian/Python2.7/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 60, in main1
    check_general_config(general_config, input_config_fn)
  File "/ds3512/home/panyp/ruanjian/Python2.7/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 40, in check_general_config
    raise Exception(msg)
Exception: Missing options. We now require both "pa_daligner_option" (stage 0) and "ovlp_daligner_option" (stage 1), which are automatically passed along to
  HPC.daligner
  HPC.TANmask
  HPC.REPmask
These can provide additional flags:
  pa_HPCdaligner_option
  pa_HPCTANmask_option
  ovlp_HPCdaligner_option
  pa_REPmask_code (-g/-c pairs for 3 iterations, e.g. '1,20;5,15;20,10')
```

I thought my configure file had something wrong. Here is my fc_run.cfg:

```
[General]
job_type = SGE

# list of files of the initial bas.h5 files
input_fofn = input.fofn
#input_fofn = preads.fofn

input_type = raw
#input_type = preads

# The length cutoff used for seed reads used for initial mapping
length_cutoff = 12000

# The length cutoff used for seed reads used for pre-assembly
length_cutoff_pr = 12000

jobqueue = your_queue
sge_option_da = -pe smp 8 -q %(jobqueue)s
sge_option_la = -pe smp 2 -q %(jobqueue)s
sge_option_pda = -pe smp 8 -q %(jobqueue)s
sge_option_pla = -pe smp 2 -q %(jobqueue)s
sge_option_fc = -pe smp 24 -q %(jobqueue)s
sge_option_cns = -pe smp 8 -q %(jobqueue)s

pa_concurrent_jobs = 32
ovlp_concurrent_jobs = 32

pa_HPCdaligner_option = -v -B24 -t16 -e.70 -l1000 -s1000
ovlp_HPCdaligner_option = -v -B24 -t32 -h60 -e.96 -l500 -s1000

pa_DBsplit_option = -x500 -s200
ovlp_DBsplit_option = -x500 -s200

falcon_sense_option = --output-multi --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 6

overlap_filtering_setting = --max-diff 100 --max-cov 100 --min-cov 20 --bestn 10 --n-core 24
```

Could you help me with this?
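For anyone who lands here with the same exception: going by the error text and warnings above, newer falcon-kit releases expect explicit pa_daligner_option and ovlp_daligner_option keys in [General], with the scheduler settings moved to a [job.defaults] section. The sketch below is only a guess at how the config above might be rewritten; the split of flags between the *_daligner_option and *_HPCdaligner_option keys, the pa_REPmask_code value, and the njobs value are assumptions carried over from the config and error message above, not verified settings.

```
# Sketch only -- values are carried over from the config above or guessed;
# check them against https://github.com/PacificBiosciences/FALCON/wiki/Configuration
[General]
input_fofn = input.fofn
input_type = raw
length_cutoff = 12000
length_cutoff_pr = 12000

# Stage 0 (raw reads) and stage 1 (preads) daligner flags, now required explicitly.
# Assumed split: daligner's own flags here, HPC.daligner wrapper flags below.
pa_daligner_option = -e.70 -l1000 -s1000 -t16
ovlp_daligner_option = -e.96 -l500 -s1000 -t32 -h60
pa_HPCdaligner_option = -v -B24
ovlp_HPCdaligner_option = -v -B24
# Example value copied from the exception message (-g/-c pairs for 3 iterations):
pa_REPmask_code = 1,20;5,15;20,10

pa_DBsplit_option = -x500 -s200
ovlp_DBsplit_option = -x500 -s200
falcon_sense_option = --output-multi --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 6
overlap_filtering_setting = --max-diff 100 --max-cov 100 --min-cov 20 --bestn 10 --n-core 24

# The old job_type / sge_option_* / *_concurrent_jobs keys move out of [General],
# as the warnings say; the SGE submit template itself is described on the wiki page.
[job.defaults]
job_type = sge
njobs = 32
```

Whether this flag split and these values suit a given data set still needs to be checked against the Configuration wiki page linked in the warning; the sketch only mirrors the old-style options from the original config.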