nick-youngblut opened this issue 3 years ago
I received a similar error from another DAS-tool job:
#CPU threads: 256
Scoring parameters: (Matrix=BLOSUM62 Lambda=0.267 K=0.041 Penalties=11/1)
Database file: /ebio/abt3_projects/Georg_animal_feces/data/metagenome/HiSeqRuns-n11/LLMGA/v0.12/cluster2/bin_refine/X238_Hanuman_Langur/DAS_Tool/bins_proteins.faa
Opening the database file... [0.002661s]
Loading sequences... [1.06957s]
Masking sequences... [26.7932s]
Error: pthread_create error: Insufficient resources to create another thread, or a system-imposed limit on the number of threads was encountered.
makeblastdb did not work for /ebio/abt3_projects/Georg_animal_feces/data/metagenome/HiSeqRuns-n11/LLMGA/v0.12/cluster2/bin_refine/X238_Hanuman_Langur/DAS_Tool/bins_proteins.faa, please check your input file
single copy gene prediction using diamond failed. Aborting
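The `pthread_create` failure points at an exhausted process/thread limit on the execution host rather than a problem with the input FASTA. A minimal diagnostic sketch using standard Linux interfaces (nothing here is DAS_Tool-specific; run it inside the same qsub job that failed):

```shell
# Inspect the limits that pthread_create runs into (it returns EAGAIN
# when the per-user or system-wide thread limit is exhausted).
echo "max user processes (threads count toward this): $(ulimit -u)"
echo "system-wide thread limit: $(cat /proc/sys/kernel/threads-max)"
echo "CPUs visible to this shell: $(nproc)"
```

If `nproc` reports 256 here, the tools are seeing the whole node's cores, not the slots the scheduler granted.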
It appears that DAS-Tool (at least the makeblastdb step) is trying to use more threads than were provided via the qsub job.
Interesting. I'm not sure what causes the issue here. However, multiple users have had problems with the single-copy gene prediction step lately. I'll have to refactor some parts and maybe get rid of the Ruby dependency.
Did you define 12 threads in the qsub command (or script header) as well as in the DAS_Tool command? I'm curious where the 256 comes from.
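One way to rule out core autodetection is to derive the thread count from the scheduler's allocation and pass it explicitly. A sketch assuming an SGE-style scheduler, where `NSLOTS` holds the number of granted slots inside a qsub job (the fallback value of 12 and the DAS_Tool invocation below are illustrative):

```shell
# Use the scheduler's slot count instead of letting DIAMOND/makeblastdb
# autodetect all host cores (e.g. 256 on a large node).
# NSLOTS is set by SGE inside a qsub job; 12 is an arbitrary fallback.
THREADS="${NSLOTS:-12}"
echo "running DAS_Tool with ${THREADS} threads"
# DAS_Tool -i bins.tsv -c contigs.fa -o das_out -t "${THREADS}"
```

If the 256 still shows up in the DIAMOND log with `-t` pinned, the thread count is being chosen by the child tool itself rather than inherited from DAS_Tool.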
DAS-tool 1.1.2 generated the following error:
The arc.all.faa contains 198238 sequences and appears to be a valid FASTA. The entire log: bins_DASTool.log