cmks / DAS_Tool

DAS Tool

DAS Tool not generating refined bin files #93

Open rialc13 opened 1 year ago

rialc13 commented 1 year ago

Hi. I am running DAS Tool with --score_threshold 0.4 and it is unable to generate refined bin files. In the bins folder it reports a single file called .fa, which is blank. If binning is not taking place, shouldn't at least an unbinned.fa file be reported? And is there any way for the blank .fa file to still carry the output base name given on the command line? Here is the log file -

DAS Tool 1.1.4

2023-01-30 03:57:38

Parameters: --bins MEGAHIT-MaxBin2-ERR2538413.tsv --contigs MEGAHIT-ERR2538413.contigs.fa --outputbasename MEGAHIT-DASTool-ERR2538413 --labels NULL --search_engine diamond --proteins NULL --write_bin_evals TRUE --write_bins TRUE --write_unbinned TRUE --threads 2 --score_threshold 0.4 --duplicate_penalty 0.6 --megabin_penalty 0.5 --dbDirectory db --resume FALSE --debug FALSE --version FALSE --help FALSE --create_plots FALSE

Dependencies:
prodigal /usr/local/bin/prodigal
diamond /usr/local/bin/diamond
pullseq /usr/local/bin/pullseq
ruby /usr/local/bin/ruby
usearch
blastp /usr/local/bin/blastp

Analyzing assembly
Predicting genes
Annotating single copy genes using diamond
Dereplicating, aggregating, and scoring bins
No bins with bin-score >0.4 found. Adjust score_threshold to report bins with lower quality. Aborting.

Writing bins
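For reference, a rerun with a lower score threshold would look roughly like the sketch below. It reuses the file names and flags from the parameter list in the log above; exact flag syntax can differ between DAS Tool versions, so check `DAS_Tool --help` for your install.

```shell
# Rerun DAS Tool with a lower score threshold so lower-quality bins
# are still reported instead of aborting with "No bins ... found".
# File names are taken from the log above; adjust to your own data.
DAS_Tool \
  --bins MEGAHIT-MaxBin2-ERR2538413.tsv \
  --contigs MEGAHIT-ERR2538413.contigs.fa \
  --outputbasename MEGAHIT-DASTool-ERR2538413 \
  --search_engine diamond \
  --write_bins \
  --write_unbinned \
  --threads 2 \
  --score_threshold 0.1
```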

cmks commented 1 year ago

Hi @rialc13, This bug has already been fixed. If you switch to the latest version (https://github.com/cmks/DAS_Tool/releases/latest), DAS Tool will not attempt to write any bins if no bins were found.

Technically, you are right: an "unbinned.fa" file could be generated in this case, containing all contigs of the input fasta file.
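Until that is built in, a file like this can be produced manually. Below is a minimal, hypothetical sketch (not part of DAS Tool) that writes an unbinned.fa from two inputs DAS Tool users already have: the contigs FASTA and a contigs2bin TSV (one `contig<TAB>bin` pair per line).

```python
# Write an unbinned.fa containing every contig that does not appear
# in any bin of a contigs2bin TSV. Plain-text FASTA/TSV parsing only;
# no external dependencies.

def read_binned_contigs(tsv_path):
    """Return the set of contig names listed in the contigs2bin TSV."""
    binned = set()
    with open(tsv_path) as fh:
        for line in fh:
            line = line.strip()
            if line:
                binned.add(line.split("\t")[0])
    return binned

def write_unbinned(contigs_fa, tsv_path, out_fa):
    """Copy unbinned records from contigs_fa to out_fa; return their count."""
    binned = read_binned_contigs(tsv_path)
    keep = False
    n_unbinned = 0
    with open(contigs_fa) as fin, open(out_fa, "w") as fout:
        for line in fin:
            if line.startswith(">"):
                # FASTA headers may carry descriptions; the contig name
                # is the first whitespace-separated token after '>'.
                name = line[1:].split()[0]
                keep = name not in binned
                n_unbinned += keep
            if keep:
                fout.write(line)
    return n_unbinned
```

With an empty bins folder (the situation in this issue), every contig ends up in unbinned.fa, which matches the behaviour @rialc13 asked for.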

rialc13 commented 1 year ago

Hi @cmks, I updated the tool to version 1.1.6 and the earlier problem is solved. Thanks! However, I would still recommend generating an unbinned.fa file in such cases.

davidecrs commented 11 months ago

Hi @cmks,

I'm getting the same error despite having version 1.1.6.

Are there some mandatory options to avoid getting this error?

Best regards