CamillePhilippeSCI closed this issue 5 months ago
Hi @CamillePhilippeSCI , Thank you very much for using the workflow! Could you paste the logs so we can take a look?
Hi!
Thanks for your quick answer !!
I am using EPI2ME Labs on a computer that I don't have access to right now,
I'll send you that tomorrow.
I was wondering if it is even possible, as the barcoding kit only goes up to 96 barcodes?
Maybe I should try on the command line instead of in the program?
Thanks
Hi @CamillePhilippeSCI , The workflow run in the EPI2ME Desktop Application is the same as the one run on the command line. We support 96 barcodes; however, I made a small test with 104 samples and the workflow ran until completion. Could you paste the error you're getting? Thank you very much! Natalia
Hi @nggvs,
Thanks for your answer, sorry I haven't sent it yet, I'll probably do that next week. What names did you give to the samples? Barcode 1 to Barcode 104?
thanks :)
I did, yes, but send me the error and the params you're using so I can have a better idea of what could be happening.
Hi @nggvs,
Here are the params:
{
    fastq: /media/minion_viro/LaCie/bc/bctest
    classifier: kraken2
    port: 8080
    host: localhost
    server_threads: 2
    kraken_clients: 2
    database_set: SILVA_138_1
    store_dir: store_dir
    taxonomic_rank: S
    abundance_threshold: 1
    n_taxa_barplot: 9
    out_dir: /home/minion_viro/epi2melabs/instances/wf-16s_38afd3f9-92f0-4143-a727-778839b129d2/output
    min_len: 800
    max_len: 2000
    threads: 4
    wf.agent: epi2melabs/4.1.1
}
And the error :
[50/8b5e01] Submitted process > kraken_pipeline:createAbundanceTables
[53/405871] Submitted process > kraken_pipeline:output_results (3)
[d9/d0d12d] Submitted process > kraken_pipeline:makeReport (1)
Error executing process > 'kraken_pipeline:makeReport (1)'
Caused by:
Process kraken_pipeline:makeReport (1)
terminated with an error exit status (137)
Command executed:
workflow-glue report "wf-16s-report.html" --workflow_name wf-16s --versions versions --params params.json --read_stats read_stats/ --lineages lineages --abundance_table "abundance_table_genus.tsv" --taxonomic_rank "G" --pipeline "kraken2" --abundance_threshold "1" --n_taxa_barplot "9"
Command exit status:
137
Command output:
(empty)
Command error:
[11:29:02 - workflow_glue] Could not load abundance_tables due to missing module anytree
[11:29:12 - matplotlib.font_manager] generated new fontManager
[11:29:13 - workflow_glue] Starting entrypoint.
.command.sh: line 2: 27 Killed workflow-glue report "wf-16s-report.html" --workflow_name wf-16s --versions versions --params params.json --read_stats read_stats/ --lineages lineages --abundance_table "abundance_table_genus.tsv" --taxonomic_rank "G" --pipeline "kraken2" --abundance_threshold "1" --n_taxa_barplot "9"
Work dir:
/home/minion_viro/epi2melabs/instances/wf-16s_38afd3f9-92f0-4143-a727-778839b129d2/work/d9/d0d12daac674a26900b529b21024d9
Tip: you can replicate the issue by changing to the process work dir and entering the command bash .command.run
Could it be because I asked for species and not genus in the rank?
Thanks !! :)
Hi @CamillePhilippeSCI , The error 137 is usually due to a lack of memory. If you're running the workflow through the Desktop Application, you can override the memory assigned to the process by using a custom Nextflow config. In the app you can do it when setting up the workflow: go to Nextflow Configuration and paste this into Configuration:
process {
    withName: makeReport {
        memory = 8.GB
    }
}
You can use a higher value depending on your device.
Let me know if this works for you!
Hi @nggvs,
Thanks a lot! It works :)
Camille
Glad to hear that! Please let us know any other issue you find. Thank you for using the workflow and the app!
Dear Community,
I have been sequencing 144 samples on 6 flow cells, so I have 144 barcodes in total. When running this workflow, I am only able to run 96 barcodes at a time. When running the Kraken pipeline with 96 barcodes, it says "stopped with error", but I still get the taxonomic table, and that is great because I need that file. I would like to have a complete table with all 144 samples in it; would there be a way of doing so?
Thank you!
Camille :)
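One possible way to get a single table for all 144 samples is to merge the per-batch abundance tables outside the workflow. The sketch below is not part of wf-16s; the function name, the example file paths, and the assumption that each run produces a tab-separated table with a taxon column (here called `tax`) plus one count column per barcode are all hypothetical, so check the header of your own `abundance_table_*.tsv` files first.

```python
# Hypothetical sketch: combine abundance tables from several wf-16s batches.
# Assumption: each table is a TSV with a taxon column ("tax" here) and one
# count column per barcode; adjust taxon_col to match your files' header.
from functools import reduce

import pandas as pd


def merge_abundance_tables(paths, taxon_col="tax"):
    """Outer-merge per-batch abundance tables on the taxon column,
    filling taxa that are absent from a batch with zero counts."""
    tables = [pd.read_csv(p, sep="\t") for p in paths]
    merged = reduce(
        lambda left, right: left.merge(right, on=taxon_col, how="outer"),
        tables,
    )
    # Taxa missing from a batch come back as NaN after the outer merge;
    # treat them as zero counts for those barcodes.
    count_cols = [c for c in merged.columns if c != taxon_col]
    merged[count_cols] = merged[count_cols].fillna(0)
    return merged


# Example usage (paths are placeholders for your per-batch outputs):
# combined = merge_abundance_tables(
#     ["batch1/abundance_table_species.tsv", "batch2/abundance_table_species.tsv"]
# )
# combined.to_csv("abundance_table_species_all144.tsv", sep="\t", index=False)
```

The outer merge keeps every taxon seen in any batch, so the combined table has one row per taxon and one column per barcode across all runs.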