ddarriba / modeltest

Best-fit model selection
GNU General Public License v3.0

Floating point exception (core dumped) #53

Open WongEB opened 2 years ago

WongEB commented 2 years ago

Hi,

I am running modeltest-ng to get the best-fit model for raxml-ng. Out of my 30 genes, one aborted halfway through the run and modeltest-ng exited with the error "Floating point exception (core dumped)".

Here is the command I was using: modeltest-ng -d aa -i C0256.laln -o C0256.laln.mt -ff

I saw that a previous discussion suggested using only a single core for the run (I tried -p 1), but the error still persisted in my case.

I tried a few parameter options, and the run consistently stopped when it was supposed to evaluate the +I+G4 models. According to the manual, modeltest-ng by default considers both the proportion of invariant sites (I) and discrete gamma rate categories (G) when searching for the best model of rate heterogeneity. So I ran a few tests including the I and G options separately, as shown below, and surprisingly my file then ran to completion.
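For reference, the separate runs were along these lines (if I am reading the manual correctly, the -h/--model-het option restricts which rate-heterogeneity schemes are evaluated, with i meaning +I only and g meaning +G only):

modeltest-ng -d aa -i C0256.laln -o C0256.laln.mt -ff -h i
modeltest-ng -d aa -i C0256.laln -o C0256.laln.mt -ff -h g

Both of these finish without the crash; the abort only happens when the +I+G4 models are included.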

However, I would prefer to obtain the best-fit model with both I and G included in the evaluation, or at least to understand what is happening in the run before settling for either I or G alone (just in case I have to). A floating point exception sounds like a division-by-zero error to me. Does anybody have an idea what is causing the error and what I should do about it?
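For illustration only, and not necessarily what modeltest-ng is doing internally: on Linux, an integer division by zero raises SIGFPE, which the shell reports exactly as "Floating point exception (core dumped)", whereas a floating-point division by zero normally just produces inf/nan unless FP traps have been enabled. A minimal C sketch of the symptom:

/* Run with no arguments: divisor becomes 0 and the integer division
   raises SIGFPE, reported as "Floating point exception (core dumped)". */
#include <stdio.h>

int main(int argc, char **argv) {
    (void)argv;                  /* unused */
    int divisor = argc - 1;      /* 0 when no command-line arguments are given */
    int result = 100 / divisor;  /* integer division by zero -> SIGFPE on Linux */
    printf("%d\n", result);
    return 0;
}

Compiled with gcc and run with no arguments, this reproduces the same shell message.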

Thanks! Bhei

ddarriba commented 2 years ago

Hi Bhei,

The bug related to the multi-threaded version has already been fixed. I think I would need to take a look at your data. Could you please send me the input files at diego.darriba (at) udc (dot) com, or attach a minimal example that does not work?