MaestSi / MetONTIIME

A Meta-barcoding pipeline for analysing ONT data in QIIME2 framework
GNU General Public License v3.0

missing fromPath parameter error when trying test data set #101

Closed. irc47 closed this issue 6 months ago

irc47 commented 6 months ago

Hi, I'm a new user trying to test out the workflow with the test dataset, but I'm having a very hard time getting the configuration file set up. I keep getting the error "missing 'fromPath' parameter". I saw this conversation from 2023 (https://forum.qiime2.org/t/error-using-metontiime-missing-frompath-parameter/28345) and tried making the suggested changes, but without success.

One thing that would be helpful is a sample filled-in .conf file alongside a recommended file structure. Right now I've made my working directory inside the MetONTIIME folder because I thought that might solve the issue, but I'm not sure if that's a good idea.

Alternatively, I've attached the file I made and hopefully you'll be able to help me see what I'm doing wrong. Thanks! -Ilana

metontiime2.conf.txt

MaestSi commented 6 months ago

Hi, first of all, all paths should be absolute, not relative (i.e. they should not start with ./). Second, dbSequencesQza and dbTaxonomyQza should contain only the file name, without the full path, so you should write:

//Name of database file with sequences as QIIME2 artifact (qza); if it is already available, it should be put in resultsDir/importDb
dbSequencesQza="2022.10.backbone.full-length.fna.qza"
//Name of database file with sequence id-to-taxonomy correspondence as QIIME2 artifact (qza); if it is already available, it should be put in resultsDir/importDb
dbTaxonomyQza="2022.10.taxonomy.asv.tsv.qza"

Third, dbSequencesFasta and dbTaxonomyTsv should point to a real file; for example, you may write the full path to an empty file. Best, SM
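
(For illustration, a filled-in version of these entries might look like the sketch below. Only the parameter names and the file-name-only convention come from this thread; the absolute paths are placeholders based on the error log further down, and the /home/irc2/databases location is an assumed example, not a verified path.)

//Absolute path to results directory (placeholder path; adjust to your system)
resultsDir="/home/irc2/MeTONTIIME_Test/results"
//Absolute paths to database fasta and taxonomy tsv (must point to real, existing files)
dbSequencesFasta="/home/irc2/databases/2022.10.backbone.full-length.fna"
dbTaxonomyTsv="/home/irc2/databases/2022.10.taxonomy.asv.tsv"
//File names only (no path); if already available, place them in resultsDir/importDb
dbSequencesQza="2022.10.backbone.full-length.fna.qza"
dbTaxonomyQza="2022.10.taxonomy.asv.tsv.qza"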

irc47 commented 6 months ago

Thank you, these were helpful pointers. I made those changes and that solved the problem. However, the pipeline ran up to the assignTaxonomy step and then ended with the error "Classifier Blast is not supported". I've also tried with Vsearch and got the same error. Is there something else I'm doing wrong?

The dataset is the provided test data; the config and nextflow files are attached below:

irc_nextflow_may17.txt metontiime2.conf.txt

MaestSi commented 6 months ago

Hi, sorry for the late reply. This is the error:

Command error:
  Saved BLASTDB to: /home/irc2/MeTONTIIME_Test/results//importDb/blastIndexedDb.qza
  .command.sh: line 14:    40 Killed                  qiime feature-classifier classify-consensus-blast --i-query /home/irc2/MeTONTIIME_Test/results//derepSeq/rep-seqs.qza --i-blastdb /home/irc2/MeTONTIIME_Test/results//importDb/blastIndexedDb.qza --i-reference-taxonomy /home/irc2/MeTONTIIME_Test/results//importDb/2022.10.taxonomy.asv.tsv.qza --p-num-threads 6 --p-perc-identity 0.9 --p-query-cov 0.8 --p-maxaccepts 3 --p-min-consensus 0.7 --o-classification /home/irc2/MeTONTIIME_Test/results//assignTaxonomy/taxonomy.qza --o-search-results /home/irc2/MeTONTIIME_Test/results//assignTaxonomy/search_results.qza

Usually, I see a "Killed" error when the amount of RAM is not enough. I saw in your conf file (line 133 or 229, depending on whether you are using the Docker or Singularity profile) that only 10 GB are available for the assignTaxonomy process:

  memory = { params.assignTaxonomy ? 10.GB + (2.GB * (task.attempt-1)) : 1.GB }

You may try increasing this value to 60.GB, for example, if you have that much RAM available (or use a smaller database). Let me know if this solves the issue. Best, SM
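
(As a sketch, assuming the directive quoted above sits inside a withName:assignTaxonomy block of the profile you are running, the edited line could look like this; keep the rest of the block unchanged.)

withName:assignTaxonomy {
  // raise the base memory from 10.GB to 60.GB (adjust to what your machine actually has)
  memory = { params.assignTaxonomy ? 60.GB + (2.GB * (task.attempt-1)) : 1.GB }
}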

irc47 commented 6 months ago

Hi, thank you very much. I think my database files may have been corrupted, because once I re-downloaded them I was able to run the pipeline on the test dataset without increasing the RAM. -Ilana

MaestSi commented 6 months ago

Glad it worked! Best, Simone