**Closed:** nalbright closed this issue 6 years ago.
hi, apologies -- please add `--use-singularity` to the snakemake command line!

best, --titus
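Putting the flag together with the Quick Start invocation quoted later in this thread, the full command would look something like this (a sketch; the config filename is the one from the Really Quick Copy and Paste Quick Start):

```shell
# Quick Start command with the container flag added
snakemake -p --use-singularity \
    --configfile=config/custom_readfilt_workflow.json \
    read_filtering_pretrim_workflow
```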
Thanks for the quick response! I went ahead and added the flag, but received the following error: it cannot find two of the files that are specified for download in the input .json (from the Really Quick Copy and Paste Quick Start):
```
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
	count	jobs
	1	download_reads
	2	pre_trimming_quality_assessment
	1	read_filtering_pretrim_workflow
	4

Job 2: --- Pre-trim quality check of trimmed data with fastqc.

fastqc -t 1 //data/SRR606249_subset25_1_reads.fq.gz /data/SRR606249_subset25_2_reads.fq.gz -o /data
Activating singularity image /home/user/dahak_2018/dahak/workflows/.snakemake/singularity/f1d03c0a142609dc68fd5a6943abcaad.simg
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
	LANGUAGE = "en_US",
	LC_ALL = (unset),
	LANG = "en_US.UTF-8"
    are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
Skipping '//data/SRR606249_subset25_1_reads.fq.gz' which didn't exist, or couldn't be read
Skipping '/data/SRR606249_subset25_2_reads.fq.gz' which didn't exist, or couldn't be read
Waiting at most 5 seconds for missing files.
MissingOutputException in line 143 of /home/user/dahak_2018/dahak/workflows/read_filtering/Snakefile:
Missing files after 5 seconds:
data/SRR606249_subset25_1_reads_fastqc.zip
data/SRR606249_subset25_2_reads_fastqc.zip
This might be due to filesystem latency. If that is the case, consider to increase the wait time with --latency-wait.
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /home/user/dahak_2018/dahak/workflows/.snakemake/log/2018-07-16T151421.927736.snakemake.log
```
excellent - did you run the 'export' command just before the snakemake command in the tutorial?
I did, but I must have started a new session and forgotten to run it again. But success, it worked! :)
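For anyone else hitting this: the tutorial's `export` has to be re-run in every fresh shell session before invoking snakemake, since environment variables don't survive across sessions. A sketch of the full session follows; the exact variable is whatever the tutorial specifies, and the `SINGULARITY_BINDPATH` value below is only an assumption based on the `/data` bind paths visible in the logs above:

```shell
# Re-run the tutorial's export in every new session before snakemake.
# The variable/value below is an assumption -- copy the exact line
# from the tutorial, not from here.
export SINGULARITY_BINDPATH="data:/data"

snakemake -p --use-singularity \
    --configfile=config/custom_readfilt_workflow.json \
    read_filtering_pretrim_workflow
```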
Looking back, I remember now that early in the documentation it mentions adding the `--use-singularity` flag. Perhaps it would be useful to add it to the Really Quick Copy and Paste commands, so others don't make the same mistake I did and forget to add it after a copy and paste?
Thanks for your help Titus!
yep, we are fixin' the things!
thanks!
Addressed by #114 from @ctb, which has been merged into the comparison-docs branch.
On Ubuntu, I could not get the wget link to download Singularity, so I got it directly from the webpage listed in the install instructions. Everything else installed smoothly.
As per the Really Quick Copy and Paste Quick Start, I ran:

```
snakemake -p --configfile=config/custom_readfilt_workflow.json read_filtering_pretrim_workflow
```
ERROR:

```
2 of 7 steps (29%) done

Job 2: --- Pre-trim quality check of trimmed data with fastqc.

fastqc -t 1 //data/SRR606249_subset25_1_reads.fq.gz /data/SRR606249_subset25_2_reads.fq.gz -o /data
/bin/bash: fastqc: command not found
Error in rule pre_trimming_quality_assessment:
    jobid: 2
    output: data/SRR606249_subset25_1_reads_fastqc.zip, data/SRR606249_subset25_2_reads_fastqc.zip

RuleException:
CalledProcessError in line 162 of /home/user/dahak_2018/dahak/workflows/read_filtering/Snakefile:
Command ' set -euo pipefail;  fastqc -t 1 //data/SRR606249_subset25_1_reads.fq.gz /data/SRR606249_subset25_2_reads.fq.gz -o /data ' returned non-zero exit status 127.
  File "/home/user/dahak_2018/dahak/workflows/read_filtering/Snakefile", line 162, in __rule_pre_trimming_quality_assessment
  File "/home/user/miniconda3/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /home/user/dahak_2018/dahak/workflows/.snakemake/log/2018-07-16T141402.533945.snakemake.log
```