Dahak requires you to use Singularity, and the command you ran does not use Singularity. Please see the first section of the quick start:
> The user should also pass the `--use-singularity` flag to tell Snakemake to use Singularity containers. This also requires the `SINGULARITY_BINDPATH` variable to be set, to bind-mount a local directory into the container.
Also see the requirements, listed on the main index and on the installation page:
> REQUIRED:
> - Singularity or Docker
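If it is not obvious whether Singularity is available on the machine, a quick check from the shell (generic commands, not specific to dahak) is:

```bash
# Print the path to the singularity binary (if any) and its version.
command -v singularity && singularity --version
```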
You can copy and paste this into a terminal:
```bash
export SINGULARITY_BINDPATH="data:/data"

snakemake -p -n \
    --configfile=config/custom_readfilt_workflow.json \
    read_filtering_pretrim_workflow
```
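For the actual run (i.e. without the `-n` dry-run flag), the quick start passage quoted above implies `--use-singularity` must also be passed. A minimal sketch of that invocation, assuming the same config file and target:

```bash
# Bind-mount the local data/ directory into the container, as in the quick start.
export SINGULARITY_BINDPATH="data:/data"

# Sketch of the real (non-dry-run) invocation: --use-singularity tells Snakemake
# to execute each rule inside its Singularity container.
snakemake -p \
    --use-singularity \
    --configfile=config/custom_readfilt_workflow.json \
    read_filtering_pretrim_workflow
```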
Executed on RedHat:

```
snakemake -p --configfile=config/custom_readfilt_workflow.json read_filtering_pretrim_workflow
```
Output:

```
2018-07-13 12:19:50 (3.05 MB/s) - ‘data/SRR606249_subset10_2_reads.fq.gz’ saved [368182802/368182802]

Finished job 5.
3 of 7 steps (43%) done

Job 2: --- Pre-trim quality check of trimmed data with fastqc.

fastqc -t 1 //data/SRR606249_subset10_1_reads.fq.gz /data/SRR606249_subset10_2_reads.fq.gz -o /data
/usr/bin/bash: fastqc: command not found
Error in rule pre_trimming_quality_assessment:
    jobid: 2
    output: data/SRR606249_subset10_1_reads_fastqc.zip, data/SRR606249_subset10_2_reads_fastqc.zip

RuleException:
CalledProcessError in line 162 of /data/home/nalbright/dahak/workflows/read_filtering/Snakefile:
Command ' set -euo pipefail; fastqc -t 1 //data/SRR606249_subset10_1_reads.fq.gz /data/SRR606249_subset10_2_reads.fq.gz -o /data ' returned non-zero exit status 127.
  File "/data/home/nalbright/dahak/workflows/read_filtering/Snakefile", line 162, in __rule_pre_trimming_quality_assessment
  File "/data/home/nalbright/miniconda3/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /data/home/nalbright/dahak/workflows/.snakemake/log/2018-07-13T121116.897357.snakemake.log
```
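Exit status 127 from the shell means "command not found": `fastqc` only exists inside the Singularity container, not on the host, so running the workflow without `SINGULARITY_BINDPATH` and `--use-singularity` fails at this rule. A quick way to confirm that on the host (an illustrative check, not part of the workflow):

```bash
# On the host, fastqc is not on the PATH; this prints a hint instead of a path.
command -v fastqc || echo "fastqc not found on host PATH (it is provided by the container)"
```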