PacificBiosciences / FALCON_unzip

Making diploid assembly common practice for genomic studies
BSD 3-Clause Clear License

Running falcon_unzip error: "[ERROR]Contents of '/data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/p_000000F.sh.log':" #51

Closed. ls2017 closed this issue 7 years ago.

ls2017 commented 7 years ago
  1. I got the following error when I ran falcon_unzip:
    
    ls2017@c3n20:/data1/ls2017/falcon/tests/unzip/run1119$ ./unzip.sh 
    [INFO]Setup logging from file "None".
    [INFO]# of tasks in complete graph: 1
    [INFO]tick: 1, #updatedTasks: 0, sleep_time=0.000000
    [INFO]Queued 'task://localhost/track_reads' ...
    [INFO]Running task from function task_track_reads()
    [INFO]tick: 2, #updatedTasks: 1, sleep_time=0.000000
    [INFO](local) '/data1/ls2017/falcon/tests/unzip/run1119/track_reads.sh'
    [INFO]tick: 4, #updatedTasks: 1, sleep_time=0.200000
    [INFO]tick: 8, #updatedTasks: 1, sleep_time=0.600000
    [INFO]tick: 16, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 32, #updatedTasks: 1, sleep_time=1.000000
    [INFO]'/data1/ls2017/falcon/tests/unzip/run1119/track_reads_done.exit' found.
    [INFO]Success ('done'). Joining 'task://localhost/track_reads'...
    [INFO]_refreshTargets() finished with no thread running and no new job to submit
    [INFO]# of tasks in complete graph: 3
    [INFO]tick: 1, #updatedTasks: 0, sleep_time=0.000000
    [INFO]Queued 'task://localhost/aln_000000F' ...
    [INFO]Running task from function task_run_blasr()
    [INFO]tick: 2, #updatedTasks: 1, sleep_time=0.000000
    [INFO](local) '/data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/aln_000000F.sh'
    [INFO]tick: 4, #updatedTasks: 1, sleep_time=0.200000
    [INFO]tick: 8, #updatedTasks: 1, sleep_time=0.600000
    [INFO]tick: 16, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 32, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 64, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 128, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 256, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 512, #updatedTasks: 1, sleep_time=1.000000
    [INFO]tick: 1024, #updatedTasks: 1, sleep_time=1.000000
    [INFO]'/data1/ls2017/falcon/tests/unzip/run1119/3-unzip/0-phasing/000000F/aln_000000F_done.exit' found.
    [INFO]Success ('done'). Joining 'task://localhost/aln_000000F'...
    [INFO]Queued 'task://localhost/p_000000F' ...
    [INFO]Running task from function task_phasing()
    [INFO](local) '/data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/p_000000F.sh'
    [WARNING]Call 'bash /data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/p_000000F.sh 1> /data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/p_000000F.sh.log 2>&1' returned 256.
    [ERROR]Contents of '/data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/p_000000F.sh.log':
    trap 'touch /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/0-phasing/000000F/p_000000F_done.exit' EXIT
    + trap 'touch /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/0-phasing/000000F/p_000000F_done.exit' EXIT
    cd /data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/
    + cd /data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/
    hostname
    + hostname
    c3n20
    date
    + date
    Fri Nov 18 16:26:38 PST 2016
    cd /data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/
    + cd /data1/ls2017/falcon/tests/unzip/run1119/./3-unzip/0-phasing/000000F/
    fc_phasing.py --bam /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/0-phasing/000000F/000000F_sorted.bam --fasta /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/reads/000000F_ref.fa --ctg_id 000000F --base_dir ../
    + fc_phasing.py --bam /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/0-phasing/000000F/000000F_sorted.bam --fasta /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/reads/000000F_ref.fa --ctg_id 000000F --base_dir ../
    No handlers could be found for logger "pypeflow.task"
    Exception in thread Thread-1:
    Traceback (most recent call last):
    File "/usr/lib/python2.7/threading.py", line 810, in __bootstrap_inner
    self.run()
    File "/usr/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
    File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 348, in __call__
    return self.runInThisThread(*argv, **kwargv)
    File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 369, in runInThisThread
    self.run(*argv, **kwargv)
    File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 277, in run
    rtn = self._runTask(self, *argv, **kwargv)
    File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 191, in _runTask
    return self._taskFun(self)
    File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/phasing.py", line 24, in make_het_call
    p = subprocess.Popen(shlex.split("samtools view %s %s" % (bam_fn, ctg_id) ), stdout=subprocess.PIPE)
    File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
    File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
    OSError: [Errno 2] No such file or directory

    /data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py:537: UserWarning:
    "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
    "! Please wait for all threads / processes to terminate !"
    "! Also, maybe use 'ps' or 'qstat' to check all threads,!"
    "! processes and/or jobs are terminated cleanly. !"
    "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
      warnings.warn(shutdown_msg)
    Traceback (most recent call last):
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/bin/fc_phasing.py", line 4, in <module>
        __import__('pkg_resources').run_script('falcon-unzip==0.1.0', 'fc_phasing.py')
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 744, in run_script
        self.require(requires)[0].run_script(script_name, ns)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1499, in run_script
        exec(code, namespace, namespace)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/EGG-INFO/scripts/fc_phasing.py", line 4, in <module>
        main(sys.argv)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/phasing.py", line 574, in main
        phasing(args)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/phasing.py", line 554, in phasing
        wf.refreshTargets()
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 548, in refreshTargets
        raise Exception('Caused by:\n' + tb)
    Exception: Caused by:
    Traceback (most recent call last):
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 523, in refreshTargets
        rtn = self._refreshTargets(task2thread, objs = objs, callback = callback, updateFreq = updateFreq, exitOnFailure = exitOnFailure)
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 740, in _refreshTargets
        raise TaskFailureError("Counted %d failure(s) with 0 successes so far." %failedJobCount)
    TaskFailureError: 'Counted 1 failure(s) with 0 successes so far.'

    touch /data1/ls2017/falcon/tests/unzip/run1119/3-unzip/0-phasing/000000F/p_000000F_done.exit

    [INFO]Failure ('fail'). Joining 'task://localhost/p_000000F'...
    [CRITICAL]Any exception caught in RefreshTargets() indicates an unrecoverable error. Shutting down...
    /data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py:537: UserWarning:
    "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
    "! Please wait for all threads / processes to terminate !"
    "! Also, maybe use 'ps' or 'qstat' to check all threads,!"
    "! processes and/or jobs are terminated cleanly. !"
    "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
      warnings.warn(shutdown_msg)
    [WARNING]#tasks=2, #alive=0
    Traceback (most recent call last):
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/bin/fc_unzip.py", line 4, in <module>
        __import__('pkg_resources').run_script('falcon-unzip==0.1.0', 'fc_unzip.py')
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 744, in run_script
        self.require(requires)[0].run_script(script_name, ns)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1499, in run_script
        exec(code, namespace, namespace)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/EGG-INFO/scripts/fc_unzip.py", line 4, in <module>
        main(sys.argv)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/unzip.py", line 468, in main
        unzip_all(config)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/unzip.py", line 384, in unzip_all
        wf.refreshTargets()
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 548, in refreshTargets
        raise Exception('Caused by:\n' + tb)
    Exception: Caused by:
    Traceback (most recent call last):
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 523, in refreshTargets
        rtn = self._refreshTargets(task2thread, objs = objs, callback = callback, updateFreq = updateFreq, exitOnFailure = exitOnFailure)
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 740, in _refreshTargets
        raise TaskFailureError("Counted %d failure(s) with 0 successes so far." %failedJobCount)
    TaskFailureError: 'Counted 1 failure(s) with 0 successes so far.'

    [INFO]Setup logging from file "None".
    [INFO]Setup logging from file "None".
    [INFO]# of tasks in complete graph: 1
    [INFO]tick: 1, #updatedTasks: 0, sleep_time=0.000000
    [WARNING]input does not exist yet (in this filesystem): PypeLocalFile('file://localhost/data1/ls2017/falcon/tests/unzip/run1119/3-unzip/1-hasm/hasm_done', '/data1/ls2017/falcon/tests/unzip/run1119/3-unzip/1-hasm/hasm_done') - waiting up to 60s
    [CRITICAL]Any exception caught in RefreshTargets() indicates an unrecoverable error. Shutting down...
    /data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py:537: UserWarning:
    "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
    "! Please wait for all threads / processes to terminate !"
    "! Also, maybe use 'ps' or 'qstat' to check all threads,!"
    "! processes and/or jobs are terminated cleanly. !"
    "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
      warnings.warn(shutdown_msg)
    [WARNING]#tasks=0, #alive=0
    Traceback (most recent call last):
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/bin/fc_quiver.py", line 4, in <module>
        __import__('pkg_resources').run_script('falcon-unzip==0.1.0', 'fc_quiver.py')
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 744, in run_script
        self.require(requires)[0].run_script(script_name, ns)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1499, in run_script
        exec(code, namespace, namespace)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/EGG-INFO/scripts/fc_quiver.py", line 4, in <module>
        main(sys.argv)
      File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/run_quiver.py", line 296, in main
        wf.refreshTargets() #force refresh now, will put proper dependence later
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 548, in refreshTargets
        raise Exception('Caused by:\n' + tb)
    Exception: Caused by:
    Traceback (most recent call last):
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 523, in refreshTargets
        rtn = self._refreshTargets(task2thread, objs = objs, callback = callback, updateFreq = updateFreq, exitOnFailure = exitOnFailure)
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/controller.py", line 637, in _refreshTargets
        if not (set(prereqJobURLs) & updatedTaskURLs) and taskObj.isSatisfied():
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 150, in isSatisfied
        return not self._getRunFlag()
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 144, in _getRunFlag
        return any( [ f(self.inputDataObjs, self.outputDataObjs, self.parameters) for f in self._compareFunctions] )
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/task.py", line 865, in timeStampCompare
        inputDataObjsTS.append((f.timeStamp, 'A', f))
      File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/data.py", line 114, in timeStamp
        self.localFileName,repr(os.listdir(os.path.dirname(self.localFileName))))
    OSError: [Errno 2] No such file or directory: '/data1/ls2017/falcon/tests/unzip/run1119/3-unzip/1-hasm'

    ls2017@c3n20:/data1/ls2017/falcon/tests/unzip/run1119$ Write failed: Broken pipe


2. My fc_unzip.cfg is as follows:

```ini
[General]
job_type = local

[Unzip]
input_fofn= input.fofn
input_bam_fofn= input_bam.fofn

smrt_bin=/data1/ls2017/smrt/install/install/smrtlink-fromsrc_3.1.0.180439,180439-180439-180128-180128-180128/bundles/smrttools/smrtcmds/bin/

jobqueue =
sge_phasing=
sge_quiver=
sge_track_reads=
sge_blasr_aln=
sge_hasm=
unzip_concurrent_jobs = 6
quiver_concurrent_jobs = 6
```
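For reference, `input_fofn` and `input_bam_fofn` point to plain "file of file names" lists, one absolute path per line. A minimal sketch of how such files might be built; the read paths below are hypothetical placeholders, not taken from this run:

```bash
# Hypothetical example: each .fofn is just a newline-separated list of paths.
ls /data1/ls2017/raw/*.subreads.fasta > input.fofn      # raw reads used by FALCON
ls /data1/ls2017/raw/*.subreads.bam   > input_bam.fofn  # matching PacBio BAMs used by Quiver
cat input.fofn input_bam.fofn                           # sanity-check the contents
```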

Not sure how to solve this. Could anyone help me?

Thanks!
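A note on the first log above: the `OSError: [Errno 2] No such file or directory` inside `make_het_call` in `phasing.py` is raised by `subprocess.Popen` while trying to launch `samtools view` (see the traceback in p_000000F.sh.log), which usually means the executable itself cannot be found, i.e. `samtools` (or the `smrt_bin` directory meant to provide it) is not on the `PATH` of the job. A quick check, assuming the `smrt_bin` value from the config above:

```bash
# Verify that the tools the phasing step shells out to are resolvable on PATH.
# SMRT_BIN is the smrt_bin value from fc_unzip.cfg; adjust if yours differs.
SMRT_BIN='/data1/ls2017/smrt/install/install/smrtlink-fromsrc_3.1.0.180439,180439-180439-180128-180128-180128/bundles/smrttools/smrtcmds/bin'
export PATH="$SMRT_BIN:$PATH"
for tool in samtools blasr; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found $tool at $(command -v "$tool")"
    else
        echo "MISSING: $tool is not on PATH" >&2
    fi
done
```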

pb-cdunn commented 7 years ago

You might be mixing versions. The latest FALCON will not work with the latest public FALCON_unzip. We will update both when we have a passing test-case. For now, we recommend using the FALCON that is in the git-submodule here, which was used for the published paper.
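If it helps, one way to see which FALCON commit a FALCON_unzip checkout expects is to inspect its git submodules. A sketch, assuming FALCON is registered as a submodule of this repository as the comment above suggests:

```bash
# Clone FALCON_unzip and list the commit(s) its submodules are pinned to.
git clone https://github.com/PacificBiosciences/FALCON_unzip.git
cd FALCON_unzip
git submodule status          # shows the pinned commit hash for each submodule
git submodule update --init   # check out exactly those pinned versions
```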

ls2017 commented 7 years ago

Hi pb-cdunn,

Thank you for your reply! But I don't understand what you mean by "using the FALCON that is in the git-submodule".

To install FALCON, I followed the instructions at https://github.com/PacificBiosciences/FALCON-integrate/wiki/Installation

```
git clone git://github.com/PacificBiosciences/FALCON-integrate.git
cd FALCON-integrate
git checkout master  # or whatever version you want
make init
source env.sh
make config-edit-user
make -j all
make test  # to run a simple one
```
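After `make -j all`, it may also be worth confirming that the console scripts the unzip step needs were installed into the virtualenv and resolve first on `PATH`. A sketch, assuming the `fc_env` layout shown in the tracebacks above:

```bash
# Re-enter the build environment and check that the installed entry points resolve.
cd FALCON-integrate
source env.sh
which fc_run.py fc_unzip.py fc_quiver.py   # should all point into .../fc_env/bin
python -c "import falcon_kit, falcon_unzip; print(falcon_kit.__file__); print(falcon_unzip.__file__)"
```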

Could you tell me how to install the FALCON that is in the git-submodule so that it works with the latest FALCON_unzip?

Thanks!

pb-cdunn commented 7 years ago

It could be non-trivial. I haven't done it myself.

Anyway, I will update this week so that unzip can use the latest Falcon....

ls2017 commented 7 years ago

Hi pb-cdunn,

Thank you for your help!

Today I installed the latest FALCON and FALCON_unzip, then tried to run FALCON_unzip again. I still got an error (a different message from before), as follows:

```
ls2017@c3n20:/data1/ls2017/falcon_unzip/ecolitest$ ./unzip.sh
[INFO]Setup logging from file "None".
[WARNING]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pwatcher/fs_based.pyc'>
[INFO]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pwatcher/fs_based.pyc'>
[INFO]job_type='local', job_queue='UNSPECIFIED_QUEUE', sge_option=None
[INFO]Num unsatisfied: 1, graph: 1
[INFO]About to submit: Node(3-unzip/reads)
[INFO]starting job Job(jobid='P251d23b684173f', cmd='/bin/bash run.sh', rundir='/data1/ls2017/falcon_unzip/ecolitest/3-unzip/reads', options={'job_queue': 'UNSPECIFIED_QUEUE', 'sge_option': '', 'job_type': 'local'})
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P251d23b684173f', cmd='/bin/bash run.sh', rundir='/data1/ls2017/falcon_unzip/ecolitest/3-unzip/reads', options={'job_queue': 'UNSPECIFIED_QUEUE', 'sge_option': '', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[ERROR]Task Node(3-unzip/reads) failed with exit-code=256
[INFO]recently_satisfied: set([])
[INFO]Num satisfied in this iteration: 0
[INFO]Num still unsatisfied: 1
[ERROR]Some tasks are recently_done but not satisfied: set([Node(3-unzip/reads)])
[ERROR]ready: set([])
submitted: set([])
Traceback (most recent call last):
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/bin/fc_unzip.py", line 4, in <module>
    __import__('pkg_resources').run_script('falcon-unzip==0.1.0', 'fc_unzip.py')
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 743, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1498, in run_script
    exec(code, namespace, namespace)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/EGG-INFO/scripts/fc_unzip.py", line 4, in <module>
    main(sys.argv)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/unzip.py", line 365, in main
    unzip_all(config)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/unzip.py", line 230, in unzip_all
    wf.refreshTargets() #force refresh now, will put proper dependence later
  File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/simple_pwatcher_bridge.py", line 209, in refreshTargets
    self._refreshTargets(updateFreq, exitOnFailure)
  File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/simple_pwatcher_bridge.py", line 276, in _refreshTargets
    raise Exception(msg)
Exception: Some tasks are recently_done but not satisfied: set([Node(3-unzip/reads)])
[INFO]Setup logging from file "None".
[INFO]config={'input_bam_fofn': '/data1/ls2017/falcon_unzip/ecolitest/input_bam.fofn',
 'job_type': 'local',
 'sge_quiver': '',
 'sge_track_reads': '',
 'smrt_bin': '/data1/ls2017/smrt/install/install/smrtlink-fromsrc_3.1.0.180439,180439-180439-180128-180128-180128/bundles/smrttools/smrtcmds/bin/'}
[WARNING]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pwatcher/fs_based.pyc'>
[INFO]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pwatcher/fs_based.pyc'>
[INFO]job_type='local', job_queue='UNSPECIFIED_QUEUE', sge_option=None
[INFO]Num unsatisfied: 2, graph: 2
[INFO]About to submit: Node(4-quiver/track_reads_h)
[INFO]starting job Job(jobid='P8ee8355d8a19c8', cmd='/bin/bash run.sh', rundir='/data1/ls2017/falcon_unzip/ecolitest/4-quiver/track_reads_h', options={'job_queue': 'UNSPECIFIED_QUEUE', 'sge_option': '', 'job_type': 'local'})
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P8ee8355d8a19c8', cmd='/bin/bash run.sh', rundir='/data1/ls2017/falcon_unzip/ecolitest/4-quiver/track_reads_h', options={'job_queue': 'UNSPECIFIED_QUEUE', 'sge_option': '', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s

[ERROR]Task Node(4-quiver/track_reads_h) failed with exit-code=256
[INFO]recently_satisfied: set([])
[INFO]Num satisfied in this iteration: 0
[INFO]Num still unsatisfied: 2
[ERROR]Some tasks are recently_done but not satisfied: set([Node(4-quiver/track_reads_h)])
[ERROR]ready: set([])
submitted: set([])
Traceback (most recent call last):
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/bin/fc_quiver.py", line 4, in <module>
    __import__('pkg_resources').run_script('falcon-unzip==0.1.0', 'fc_quiver.py')
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 743, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1498, in run_script
    exec(code, namespace, namespace)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/EGG-INFO/scripts/fc_quiver.py", line 4, in <module>
    main(sys.argv)
  File "/data1/ls2017/falcon/install/FALCON-integrate/fc_env/lib/python2.7/site-packages/falcon_unzip-0.1.0-py2.7.egg/falcon_unzip/run_quiver.py", line 355, in main
    wf.refreshTargets()
  File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/simple_pwatcher_bridge.py", line 209, in refreshTargets
    self._refreshTargets(updateFreq, exitOnFailure)
  File "/data1/ls2017/falcon/install/FALCON-integrate/pypeFLOW/pypeflow/simple_pwatcher_bridge.py", line 276, in _refreshTargets
    raise Exception(msg)
Exception: Some tasks are recently_done but not satisfied: set([Node(4-quiver/track_reads_h)])
ls2017@c3n20:/data1/ls2017/falcon_unzip/ecolitest$
```

Could you help with this? Thanks!

pb-jchin commented 7 years ago

Hi @ls2017, currently this code is aimed at experienced users who have some development and HPC experience (https://github.com/PacificBiosciences/FALCON_unzip/wiki). The code is also published for research purposes, so that people can see exactly how we assemble diploid genomes. We won't be able to provide step-by-step instructions here for running the code in every possible environment. If you are a PacBio customer, you may want to contact PacBio customer support and give them feedback about your needs and setup for diploid assembly. Thanks.

pb-cdunn commented 7 years ago

fc_quiver.py failed (in 4-quiver/track_reads_h) because fc_unzip.py had already failed (in 3-unzip/reads). To learn why, look for the stderr output (probably in 3-unzip/reads/pwatcher.dir/stderr).
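For anyone hitting the same wall: with the newer pwatcher-based runner, each failed Node has a run directory (the `rundir` printed in the log), and the per-job output is usually captured under a `pwatcher.dir`-style location such as the path suggested above. A sketch of how one might surface the real error, using the paths from the log (adjust to your own run directory):

```bash
cd /data1/ls2017/falcon_unzip/ecolitest
# List any captured stderr/log files under the failed stages.
find 3-unzip 4-quiver \( -name 'stderr*' -o -name '*.log' \) -print 2>/dev/null
# The specific path suggested in the comment above, if it exists:
cat 3-unzip/reads/pwatcher.dir/stderr
# Or re-run the failing task's wrapper by hand to see its error directly:
( cd 3-unzip/reads && bash run.sh )
```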

As Jason says, we cannot offer much support, but I am still curious about failures. Feel free to post them, but you might not get a response.