epi2me-labs / wf-metagenomics

Metagenomic classification of long-read sequencing data

error exit status (255) #108

Closed katievigil closed 4 months ago

katievigil commented 4 months ago


Hi, this is my first time running EPI2ME Labs workflows on an HPC, and I am getting the error message below when I try to run my samples:

nextflow run epi2me-labs/wf-metagenomics \
-w ${OUTPUT}/work \
-profile singularity \
--fastq /ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal \
--out_dir ${OUTPUT} \
--taxonomic_rank S \
--database_set 'Standard-8' \
--real_time

N E X T F L O W ~ version 24.04.2

Launching https://github.com/epi2me-labs/wf-metagenomics [determined_bardeen] DSL2 - revision: 4c2c583cfd [master]

WARN: NEXTFLOW RECURSION IS A PREVIEW FEATURE - SYNTAX AND FUNCTIONALITY CAN CHANGE IN FUTURE RELEASES

[EPI2ME Labs ASCII banner]
wf-metagenomics v2.10.1-g4c2c583

Core Nextflow options
  revision       : master
  runName        : determined_bardeen
  containerEngine: singularity
  container      : [withLabel:wfmetagenomics:ontresearch/wf-metagenomics:sha44a6dacff5f2001d917b774647bb4cbc1b53bc76, withLabel:wf_common:ontresearch/wf-common:sha338caea0a2532dc0ea8f46638ccc322bb8f9af48, withLabel:amr:ontresearch/abricate:sha2c763f19fac46035437854f1e2a5f05553542a78]
  launchDir      : /ddnB/work/kvigil/Programs/libseccomp-2.5.2/cryptsetup-2.4.0/singularity-ce-3.10.2
  workDir        : /ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal/results/epi2me_out/work
  projectDir     : /home/kvigil/.nextflow/assets/epi2me-labs/wf-metagenomics
  userName       : kvigil
  profile        : singularity
  configFiles    : /home/kvigil/.nextflow/assets/epi2me-labs/wf-metagenomics/nextflow.config

Input Options
  fastq : /ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal

Real Time Analysis Options
  real_time : true

Reference Options
  database_sets :
    ncbi_16s_18s         : [reference: https://ont-exd-int-s3-euwst1-epi2me-labs.s3.amazonaws.com/wf-metagenomics/ncbi_16s_18s/ncbi_targeted_loci_16s_18s.fna, database: https://ont-exd-int-s3-euwst1-epi2me-labs.s3.amazonaws.com/wf-metagenomics/ncbi_16s_18s/ncbi_targeted_loci_kraken2.tar.gz, ref2taxid: https://ont-exd-int-s3-euwst1-epi2me-labs.s3.amazonaws.com/wf-metagenomics/ncbi_16s_18s/ref2taxid.targloci.tsv, taxonomy: https://ftp.ncbi.nlm.nih.gov/pub/taxonomy/taxdump_archive/taxdmp_2023-01-01.zip]
    ncbi_16s_18s_28s_ITS : [reference: https://ont-exd-int-s3-euwst1-epi2me-labs.s3.amazonaws.com/wf-metagenomics/ncbi_16s_18s_28s_ITS/ncbi_16s_18s_28s_ITS.fna, database: https://ont-exd-int-s3-euwst1-epi2me-labs.s3.amazonaws.com/wf-metagenomics/ncbi_16s_18s_28s_ITS/ncbi_16s_18s_28s_ITS_kraken2.tar.gz, ref2taxid: https://ont-exd-int-s3-euwst1-epi2me-labs.s3.amazonaws.com/wf-metagenomics/ncbi_16s_18s_28s_ITS/ref2taxid.ncbi_16s_18s_28s_ITS.tsv, taxonomy: https://ftp.ncbi.nlm.nih.gov/pub/taxonomy/taxdump_archive/taxdmp_2023-01-01.zip]
    SILVA_138_1          : [database: null]
    Standard-8           : [database: https://genome-idx.s3.amazonaws.com/kraken/k2_standard_08gb_20231009.tar.gz, taxonomy: https://ftp.ncbi.nlm.nih.gov/pub/taxonomy/taxdump_archive/new_taxdump_2023-03-01.zip]
    PlusPF-8             : [database: https://genome-idx.s3.amazonaws.com/kraken/k2_pluspf_08gb_20230314.tar.gz, taxonomy: https://ftp.ncbi.nlm.nih.gov/pub/taxonomy/taxdump_archive/new_taxdump_2023-03-01.zip]
    PlusPFP-8            : [database: https://genome-idx.s3.amazonaws.com/kraken/k2_pluspfp_08gb_20230314.tar.gz, taxonomy: https://ftp.ncbi.nlm.nih.gov/pub/taxonomy/taxdump_archive/new_taxdump_2023-03-01.zip]

Output Options
  out_dir : /ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal/results/epi2me_out

!! Only displaying parameters that differ from the pipeline defaults !!

If you use epi2me-labs/wf-metagenomics for your analysis please cite:


This is epi2me-labs/wf-metagenomics v2.10.1-g4c2c583.

Checking inputs.
Note: Memory available to the workflow must be slightly higher than size of the database Standard-8 index (8GB) or consider to use --kraken2_memory_mapping
Searching input for [.fastq, .fastq.gz, .fq, .fq.gz] files.
executor > local (4)
[-        ] fastcat -
[7a/8e8cc2] prepare_databases:download_unpack_taxonomy | 1 of 1, failed: 1 ✘
[16/4fc312] prepare_databases:unpack_download_kraken2_database | 0 of 1
[-        ] prepare_databases:determine_bracken_length -
[86/d6c32a] real_time_pipeline:run_common:getVersions | 1 of 1, failed: 1 ✘
[58/574dfb] real_time_pipeline:run_common:getParams | 1 of 1, failed: 1 ✘
[-        ] real_time_pipeline:kraken_server -
[-        ] real_time_pipeline:kraken2_client -
[-        ] real_time_pipeline:progressive_stats -
[-        ] real_time_pipeline:progressive_kraken_reports -
[-        ] real_time_pipeline:progressive_bracken -
[-        ] real_time_pipeline:createAbundanceTables -
[-        ] real_time_pipeline:makeReport -
[-        ] real_time_pipeline:output_results -
[-        ] real_time_pipeline:stop_kraken_server -
Note: Empty files or those files whose reads have been discarded after filtering based on read length and/or read quality will not appear in the report and will be excluded from subsequent analysis.
Kraken2 pipeline.
Preparing databases.
Using default taxonomy database.
Unpacking kraken2 indexes
Workflow will run indefinitely as no read_limit is set. Workflow will stop processing files after null reads.
Pulling Singularity image docker://ontresearch/wf-metagenomics:sha44a6dacff5f2001d917b774647bb4cbc1b53bc76 [cache /work/kvigil/.singularity/ontresearch-wf-metagenomics-sha44a6dacff5f2001d917b774647bb4cbc1b53bc76.img]
Pulling Singularity image docker://ontresearch/wf-common:sha338caea0a2532dc0ea8f46638ccc322bb8f9af48 [cache /work/kvigil/.singularity/ontresearch-wf-common-sha338caea0a2532dc0ea8f46638ccc322bb8f9af48.img]
ERROR ~ Error executing process > 'prepare_databases:unpack_download_kraken2_database'

Caused by: Process prepare_databases:unpack_download_kraken2_database terminated with an error exit status (255)

Command executed:

# Check if the folder is an url to fetch or a local path
if true
then
    wget 'https://genome-idx.s3.amazonaws.com/kraken/k2_standard_08gb_20231009.tar.gz'
fi
if [[ k2_standard_08gb_20231009.tar.gz == *.tar.gz ]]
then
    mkdir k2_standard_08gb_20231009_db
    tar xf k2_standard_08gb_20231009.tar.gz -C k2_standard_08gb_20231009_db
elif [[ k2_standard_08gb_20231009.tar.gz == *.zip ]]
then
    mkdir k2_standard_08gb_20231009.tar.gz_db
    unzip k2_standard_08gb_20231009.tar.gz -d k2_standard_08gb_20231009_db
else
    echo "Error: database is neither .tar.gz , .zip"
    echo "Exiting".
    exit 1
fi

Command exit status: 255

Command output: (empty)

Command error:
FATAL: singularity image is not owned by required group(s)
cp: ‘.command.out’ and ‘.command.out’ are the same file
cp: ‘.command.err’ and ‘.command.err’ are the same file
cp: cannot stat ‘.command.trace’: No such file or directory

Work dir: /ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal/results/epi2me_out/work/16/4fc312555f16287e8e0ba1ec5a06fa

Tip: when you have fixed the problem you can continue the execution adding the option -resume to the run command line
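Once the Singularity problem is resolved, the original invocation can be resumed from cached work rather than rerun from scratch. A sketch, reusing the exact command from the report with `-resume` appended (the `OUTPUT` variable is assumed to be set as before):

```shell
OUTPUT=/path/to/output   # placeholder: set as in the original submission script
nextflow run epi2me-labs/wf-metagenomics \
    -w ${OUTPUT}/work \
    -profile singularity \
    --fastq /ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal \
    --out_dir ${OUTPUT} \
    --taxonomic_rank S \
    --database_set 'Standard-8' \
    --real_time \
    -resume
```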

-- Check '.nextflow.log' file for details

katievigil commented 4 months ago

INFO: Converting SIF file to temporary sandbox...
FATAL: while extracting /work/kvigil/.singularity/ontresearch-wf-metagenomics-sha44a6dacff5f2001d917b774647bb4cbc1b53bc76.img: root filesystem extraction failed: extract command failed: ERROR : Failed to create user namespace: user namespace disabled : exit status 1
cp: '.command.out' and '/ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal/results/epi2me_out/work/07/5d55e8ae398768fbfdf3369dec867f/.command.out' are the same file
cp: '.command.err' and '/ddnB/work/kvigil/onr.raw.data.sup.trim/rawdata/results/trimmed/sd_marinemammal/results/epi2me_out/work/07/5d55e8ae398768fbfdf3369dec867f/.command.err' are the same file
cp: cannot stat '.command.trace': No such file or directory

katievigil commented 4 months ago

I think this is because I need root privileges to run singularity?
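The "Failed to create user namespace: user namespace disabled" FATAL above suggests it is less about root privileges and more about kernel user-namespace support, which Singularity needs to run unprivileged when it is not installed setuid. A hedged diagnostic sketch (file paths vary by distro; absence of a file is not itself an error):

```shell
# Kernel-wide toggle (absent on some distros; 1 means enabled)
cat /proc/sys/kernel/unprivileged_userns_clone 2>/dev/null

# Per-user namespace limit (0 means effectively disabled)
sysctl -n user.max_user_namespaces 2>/dev/null

# Functional test: try to actually create a user namespace
if unshare --user --map-root-user true 2>/dev/null; then
    echo "user namespaces: OK"
else
    echo "user namespaces: unavailable - likely needs an HPC admin"
fi
```

If the functional test fails, only an administrator can enable user namespaces (or install Singularity in setuid mode).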

nggvs commented 4 months ago

Hi @lucyintheskyzzz, this sounds like an issue with the Singularity setup on your HPC. Was it installed on the HPC, or did you attempt to use an installation from your local environment? Generally, it is best to raise this with the HPC admins.
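Before contacting the admins, it can help to gather a little information. A sketch of some quick checks (config paths are assumptions and differ between sites; the "limit container owners/groups" settings in singularity.conf are a common cause of the "image is not owned by required group(s)" FATAL seen above):

```shell
# Which Singularity build is on PATH, if any?
singularity --version 2>/dev/null || echo "singularity not on PATH"

# Look for site policy restricting who may run container images
for conf in /etc/singularity/singularity.conf /usr/local/etc/singularity/singularity.conf; do
    [ -r "$conf" ] && grep -E '^limit container' "$conf"
done

echo "checks complete"
```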

katievigil commented 4 months ago

@nggvs thanks for getting back to me. I think it's a Singularity permissions issue on my end. I'll talk to the HPC administrators.

nggvs commented 4 months ago

Thank you! Feel free to open new issues if something looks weird!