metagenome-atlas / atlas_analyze

Scripts to get the most out of the output of metagenome-atlas

error in rule get_taxonomy #8

Closed slambrechts closed 3 years ago

slambrechts commented 3 years ago

When running

cd atlas_analyze
python analyze.py /scratch/gent/vo/000/gvo00043/vsc42339/MICROBIAN/CLEAN_READS -s ./Snakefile

I get an error in job 3 out of 7: rule get_taxonomy:

Error in rule get_taxonomy:
    jobid: 3
    output: Results/taxonomy.tsv

Traceback (most recent call last):
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/executors/__init__.py", line 593, in _callback
    raise ex
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/executors/__init__.py", line 579, in cached_or_run
    run_func(*args)
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/executors/__init__.py", line 2460, in run_wrapper
    raise ex
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/executors/__init__.py", line 2441, in run_wrapper
    runtime_sourcecache_path,
  File "/kyukon/scratch/gent/vo/000/gvo00043/vsc42339/atlas_analyze/Snakefile", line 97, in __rule_get_taxonomy
    "Results/mapping_rate.tsv"
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/script.py", line 1365, in script
    executor.evaluate()
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/script.py", line 377, in evaluate
    self.execute_script(fd.name, edit=edit)
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/script.py", line 578, in execute_script
    self._execute_cmd("{py_exec} {fname:q}", py_exec=py_exec, fname=fname)
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/script.py", line 421, in _execute_cmd
    **kwargs
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/shell.py", line 265, in __new__
    raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'set -euo pipefail;  /scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/bin/python3.7 /kyukon/scratch/gent/vo/000/gvo00043/vsc42339/MICROBIAN/CLEAN_READS/.snakemake/scripts/tmp2ifocuhj.get_taxonomy.py' returned non-zero exit status 1.
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /kyukon/scratch/gent/vo/000/gvo00043/vsc42339/MICROBIAN/CLEAN_READS/.snakemake/log/2021-09-27T174323.495744.snakemake.log
Traceback (most recent call last):
  File "analyze.py", line 21, in <module>
    "snakemake "
  File "/scratch/gent/vo/000/gvo00043/vsc42339/conda/envs/analyze/lib/python3.7/site-packages/snakemake/shell.py", line 265, in __new__
    raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'set -euo pipefail;  snakemake -d /kyukon/scratch/gent/vo/000/gvo00043/vsc42339/MICROBIAN/CLEAN_READS -j 1 -s /Snakefile -s ./Snakefile' returned non-zero exit status 1.

Complete snakemake log:

Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 1 (use --cores to define parallelism)
Rules claiming more threads will be scaled down.
Job stats:
job                 count    min threads    max threads
----------------  -------  -------------  -------------
all                     1              1              1
analyze                 1              1              1
convert_nb              1              1              1
get_annotations         1              1              1
get_mapping_rate        1              1              1
get_taxonomy            1              1              1
import_files            1              1              1
total                   7              1              1

Select jobs to execute...

[Mon Sep 27 17:43:30 2021]
rule get_mapping_rate:
    input: stats/read_counts.tsv, genomes/counts/raw_counts_genomes.tsv
    output: Results/mapping_rate.tsv
    jobid: 4
    resources: tmpdir=/tmp

[Mon Sep 27 17:43:39 2021]
Finished job 4.
1 of 7 steps (14%) done
Select jobs to execute...

[Mon Sep 27 17:43:39 2021]
rule get_annotations:
    input: genomes/annotations/gene2genome.tsv.gz, Genecatalog/annotations/eggNog.tsv.gz
    output: genomes/annotations/KO.tsv, genomes/annotations/CAZy.tsv, Genecatalog/annotations/KO.tsv, Genecatalog/annotations/CAZy.tsv
    jobid: 6
    resources: tmpdir=/tmp, mem=60

[Mon Sep 27 17:59:30 2021]
Finished job 6.
2 of 7 steps (29%) done
Select jobs to execute...

[Mon Sep 27 17:59:30 2021]
localrule get_taxonomy:
    input: genomes/taxonomy/gtdb/gtdbtk.bac120.summary.tsv
    output: Results/taxonomy.tsv
    jobid: 3
    resources: tmpdir=/tmp

[Mon Sep 27 17:59:38 2021]
Error in rule get_taxonomy:
    jobid: 3
    output: Results/taxonomy.tsv

Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /kyukon/scratch/gent/vo/000/gvo00043/vsc42339/MICROBIAN/CLEAN_READS/.snakemake/log/2021-09-27T174323.495744.snakemake.log

Rules get_mapping_rate and get_annotations seem to have finished fine, but somehow get_taxonomy was not able to run.

SilasK commented 3 years ago

I'm trying to integrate atlas_analyze into atlas. Atlas version 2.8 (only on GitHub for the moment) creates the file "genomes/taxonomy/gtdb_taxonomy.tsv".
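For anyone hitting this before 2.8 lands: the get_taxonomy rule reads the GTDB-Tk summary (`genomes/taxonomy/gtdb/gtdbtk.bac120.summary.tsv`, per the log above) and writes `Results/taxonomy.tsv`. A rough sketch of the kind of transformation such a step performs, assuming the standard GTDB-Tk `classification` column format (`d__...;p__...;...;s__...`) and not the actual atlas_analyze script:

```python
import pandas as pd

RANKS = ["domain", "phylum", "class", "order", "family", "genus", "species"]

def split_gtdb_classification(df: pd.DataFrame) -> pd.DataFrame:
    """Split the GTDB-Tk 'classification' column into one column per
    taxonomic rank, dropping the 'd__', 'p__', ... prefixes."""
    tax = df["classification"].str.split(";", expand=True)
    tax.columns = RANKS
    # strip the three-character rank prefix from every cell
    return tax.apply(lambda col: col.str.slice(3))

# Usage (paths taken from the log above):
# df = pd.read_csv("genomes/taxonomy/gtdb/gtdbtk.bac120.summary.tsv",
#                  sep="\t", index_col=0)
# split_gtdb_classification(df).to_csv("Results/taxonomy.tsv", sep="\t")
```

If the script in your checkout assumes a column or file that your atlas version did not produce, that mismatch would explain the non-zero exit status in the traceback.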

slambrechts commented 3 years ago

@SilasK that's great!

In case this is helpful to anyone: downgrading snakemake to v6.3 also solved this problem! All the atlas_analyze steps were able to run.
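For reference, the downgrade can be done inside the same conda environment, e.g. (environment name `analyze` taken from the paths above; channel list is a typical bioconda setup, adjust to yours):

```shell
# pin snakemake to the version that worked for this issue
conda install -n analyze -c conda-forge -c bioconda "snakemake=6.3"
```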