ShaelynXU closed this issue 1 year ago.
Hi, I think the problem is that you did not specify the full path to the dbSequencesFasta and dbTaxonomyTsv files in the metontiime2.conf file. In particular, you should edit these lines:
//Path to database file with sequences in fasta format
dbSequencesFasta="/path/to/sequence.fasta"
//Path to database file with sequence id-to-taxonomy correspondence in tsv format
dbTaxonomyTsv="/path/to/taxonomy.tsv"
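As a quick sanity check before re-running, you can confirm that the files really are readable at the absolute paths you put in the conf file (the paths below are placeholders, substitute your own):

  # verify the reference files exist at the paths set in metontiime2.conf
  ls -lh /path/to/sequence.fasta /path/to/taxonomy.tsv
  # optional: count reference sequences and taxonomy rows
  grep -c '^>' /path/to/sequence.fasta
  wc -l /path/to/taxonomy.tsv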
Best, SM
Hi Simone,
Thank you for your prompt response. The importDb error got fixed. But I encountered another error with "assignTaxonomy". Here is the error information:
ERROR ~ Error executing process > 'assignTaxonomy (1)'
Caused by:
Process `assignTaxonomy (1)` terminated with an error exit status (1)

Command executed:

  mkdir -p /home/zhaohui5/MetONTIIME/output/assignTaxonomy

  classifier_uc=$(awk '{print toupper($0)}' <<< Vsearch)

  if [ "$classifier_uc" == "BLAST" ]; then
    qiime feature-classifier classify-consensus-blast \
      --i-query /home/zhaohui5/MetONTIIME/output/derepSeq/rep-seqs.qza \
      --i-reference-reads /home/zhaohui5/MetONTIIME/output/importDb/db_sequences.qza \
      --i-reference-taxonomy /home/zhaohui5/MetONTIIME/output/importDb/db_taxonomy.qza \
      --p-perc-identity 0.9 \
      --p-query-cov 0.8 \
      --p-maxaccepts 3 \
      --o-classification /home/zhaohui5/MetONTIIME/output/assignTaxonomy/taxonomy.qza \
      --o-search-results /home/zhaohui5/MetONTIIME/output/assignTaxonomy/search_results.qza
  elif [ "$classifier_uc" == "VSEARCH" ]; then
    qiime feature-classifier classify-consensus-vsearch \
      --i-query /home/zhaohui5/MetONTIIME/output/derepSeq/rep-seqs.qza \
      --i-reference-reads /home/zhaohui5/MetONTIIME/output/importDb/db_sequences.qza \
      --i-reference-taxonomy /home/zhaohui5/MetONTIIME/output/importDb/db_taxonomy.qza \
      --p-perc-identity 0.9 \
      --p-query-cov 0.8 \
      --p-maxaccepts 3 \
      --p-maxrejects 100 \
      --p-maxhits 3 \
      --p-strand 'both' \
      --p-unassignable-label 'Unassigned' \
      --p-threads 6 \
      --o-classification /home/zhaohui5/MetONTIIME/output/assignTaxonomy/taxonomy.qza \
      --o-search-results /home/zhaohui5/MetONTIIME/output/assignTaxonomy/search_results.qza
  else
    echo "Classifier Vsearch is not supported (choose between Blast and Vsearch)"
  fi

  qiime metadata tabulate \
    --m-input-file /home/zhaohui5/MetONTIIME/output/assignTaxonomy/taxonomy.qza \
    --o-visualization /home/zhaohui5/MetONTIIME/output/assignTaxonomy/taxonomy.qzv
  qiime taxa filter-table \
    --i-table /home/zhaohui5/MetONTIIME/output/derepSeq/table.qza \
    --i-taxonomy /home/zhaohui5/MetONTIIME/output/assignTaxonomy/taxonomy.qza \
    --p-exclude Unassigned \
    --o-filtered-table /home/zhaohui5/MetONTIIME/output/derepSeq/table-no-Unassigned.qza
Command exit status:
1

Command output:
(empty)

Command error:
Plugin error from feature-classifier: Command '['vsearch', '--usearch_global', '/tmp/qiime2/root/data/28445d2c-8423-4836-acb0-a727cb003d56/data/dna-sequences.fasta', '--id', '0.9', '--query_cov', '0.8', '--strand', 'both', '--maxaccepts', '3', '--maxrejects', '100', '--db', '/tmp/qiime2/root/data/cf32145f-1f2d-4652-8471-e36b3aefd1ca/data/dna-sequences.fasta', '--threads', '6', '--output_no_hits', '--maxhits', '3', '--blast6out', '/tmp/q2-BLAST6Format-oecqcjf0']' died with <Signals.SIGKILL: 9>.
Could you please help me with this too?
Thank you, Shaelyn
Hi, if it took some time before giving the error, it could be a RAM issue. Try either increasing the amount of RAM assigned to the failing process in the conf file, or running the pipeline on a cluster with more RAM available. SM
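If you want to confirm that it was the operating system's out-of-memory killer that sent the SIGKILL (rather than the classifier crashing on its own), you can check the kernel log on the machine where the job ran; this is a general Linux check, not something MetONTIIME-specific, and it may require root privileges:

  # look for OOM kills around the time the process died
  dmesg -T | grep -iE 'killed process|out of memory' | tail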
Hi Simone,
Yes, it did take some time before giving the error. Do you know what the minimum RAM requirement is for this pipeline?
Shaelyn
No, not precisely. It actually depends on the overall number of reads/ASVs and on the database size. Are you analysing many samples all at once? You may try downsampling to a maximum number of reads per sample (try 10k reads, for example), analysing samples in batches, or reducing the clustering identity parameter (try 0.9, for example). SM
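If you prefer to downsample outside the pipeline (MetONTIIME also has its own downsampleFastq step), a general-purpose tool such as seqtk can cap a sample at roughly 10k reads; the file names here are illustrative:

  # subsample ~10k reads from one barcode (fixed seed for reproducibility)
  seqtk sample -s100 barcode01.fastq.gz 10000 | gzip > barcode01_10k.fastq.gz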
I used the demo data ('Zymo ... fastq.gz') for a test run and this error occurred. The error still persists after decreasing the identity parameter (I tried 0.5, even...). My cluster shows around 25 GB of free memory; do you think that is not enough for the demo data?
Shaelyn
25G should be enough for the demo data. But did you set that amount of RAM for the corresponding process in the conf file? SM
No, I didn't. Could you please specify which line I should modify in the conf file?
Line 131 or line 227, depending on whether you are using Docker or Singularity profiles. SM
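For reference, a per-process memory override in a Nextflow config generally looks like the sketch below; the exact process selector and values on those lines of metontiime2.conf may differ, so treat this as illustrative rather than a copy-paste replacement:

  process {
      // illustrative only: raise the memory limit for the step that was killed
      withName:assignTaxonomy {
          memory = '25 GB'
          cpus = 6
      }
  }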
Thanks, Simone! It worked after increasing the RAM.
Another error occurred at "diversityAnalyses" and it indicated "Command error: Plugin error from diversity: The rarefied table contains no samples or features. Verify your table is valid and that you provided a shallow enough sampling depth."
Could you please also help me with this?
Many thanks, Shaelyn
Since the test dataset has 1k reads, you should decrease the numReadsDiversity parameter. In any case, that step is expected to give an error when calculating beta-diversity, as there is only a single sample. SM
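A general QIIME 2 way to choose a workable sampling depth is to summarize the feature table and read the per-sample frequencies from the resulting visualization; this is standard QIIME 2 usage rather than a MetONTIIME-specific step (the table path is the one from the output above):

  # inspect per-sample read counts to pick a shallow enough sampling depth
  qiime feature-table summarize \
    --i-table /home/zhaohui5/MetONTIIME/output/derepSeq/table.qza \
    --o-visualization table_summary.qzv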
Hi, I am going to close the issue. Feel free to reopen it in case you have further questions. SM
Hi Simone,
I followed the provided code to analyse the 16S sequencing data; however, an error occurred after running the command: "nextflow -c metontiime2.conf run metontiime2.nf --workDir="/home/zhaohui5/MetONTIIME/test/barcode01" --resultsDir="/home/zhaohui5/MetONTIIME/test/dir" -profile docker"
ERROR ~ Error executing process > 'importDb (1)'
Caused by:
Process `importDb (1)` terminated with an error exit status (1)

Command executed:
mkdir -p /home/zhaohui5/MetONTIIME/test/dir/importDb
qiime tools import --type 'FeatureData[Sequence]' --input-path sequence.fasta --output-path /home/zhaohui5/MetONTIIME/test/dir/importDb/db_sequences.qza
qiime tools import --type 'FeatureData[Taxonomy]' --input-path taxonomy.tsv --input-format HeaderlessTSVTaxonomyFormat --output-path /home/zhaohui5/MetONTIIME/test/dir/importDb/db_taxonomy.qza
executor >  local (2)
[41/3a2647] process > importDb (1)          [100%] 1 of 1, failed: 1 ✘
[-        ] process > concatenateFastq      -
[-        ] process > filterFastq           -
[-        ] process > downsampleFastq       -
[-        ] process > importFastq           -
[-        ] process > derepSeq              -
[-        ] process > assignTaxonomy        -
[-        ] process > filterTaxa            -
[-        ] process > taxonomyVisualization -
[-        ] process > collapseTables        -
[-        ] process > dataQC                -
[-        ] process > diversityAnalyses     -
Command exit status:
1

Command output:
(empty)

Command error:
Usage: qiime tools import [OPTIONS]

Options:
  --type TEXT                The semantic type of the artifact that will be created upon importing. Use --show-importable-types to see what importable semantic types are available in the current deployment. [required]
  --input-path PATH          Path to file or directory that should be imported. [required]
  --output-path ARTIFACT     Path where output artifact should be written. [required]
  --input-format TEXT        The format of the data to be imported. If not provided, data must be in the format expected by the semantic type provided via --type.
  --show-importable-types    Show the semantic types that can be supplied to --type to import data into an artifact.
  --show-importable-formats  Show formats that can be supplied to --input-format to import data into an artifact.
  --help                     Show this message and exit.
(1/1) Invalid value for '--input-path': Path 'sequence.fasta' does not exist.

Work dir:
/home/zhaohui5/MetONTIIME/work/41/3a264765dbb7860043c9ebbbb40772

Tip: when you have fixed the problem you can continue the execution adding the option -resume to the run command line

-- Check '.nextflow.log' file for details

Could you please provide some suggestions on this issue?
Thank you! Shaelyn