thej022214 / corHMM

Fits a generalized form of the covarion model that allows different transition rate classes on different portions of a phylogeny by treating rate classes as “hidden” states in a Markov process.

How to estimate memory usage with large data size #59

Open · ajdemery93 opened this issue 1 year ago

ajdemery93 commented 1 year ago

Hello,

I am trying to run corHMM on a large dataset (4518 spp.) with four traits (three binary, one categorical with ~10 states). I've successfully worked through the tutorial vignettes and run corHMM on a subset of my data (119 spp., the same four traits, with the categorical trait having 4 states). When I run the full dataset on a machine with 24 cores, I get a "bus error (core dumped)" message.
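For reference, the call I'm making looks roughly like this (the object names are placeholders for my own tree and trait data frame, and the settings are just what I've been trying, not anything prescribed):

```r
library(corHMM)

# phy: my phylogeny (4518 tips)
# dat: data frame with species names in column 1 and the four traits
#      in columns 2-5 (placeholder objects for my actual data)
fit <- corHMM(phy = phy, data = dat,
              rate.cat = 2,    # two hidden rate classes
              nstarts = 10,    # multiple random restarts
              n.cores = 24)    # spread restarts across the cores
```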

My question is: how can I estimate the memory usage given my full data size, so that I know how many cores and how much time I need to run the analysis successfully? I appreciate your help, and I'm happy to answer further questions.
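In case it helps pin down where the memory goes, here is the rough back-of-the-envelope calculation I tried. I'm assuming the dominant cost is something like one probability matrix per edge over the combined (observed × hidden) state space, plus the conditional-likelihood table, but that assumption may well be wrong:

```r
# Combined observed state space: 3 binary traits x 1 ten-state trait
n_obs_states <- 2 * 2 * 2 * 10           # 80 observed combinations
rate_cat     <- 2                        # assumed hidden rate classes
n_states     <- n_obs_states * rate_cat  # 160 states in the full model

n_tips  <- 4518
n_edges <- 2 * n_tips - 2                # edges in a rooted binary tree

# Rough memory if one n_states x n_states double matrix is held per edge,
# plus a (nodes x states) conditional-likelihood table
bytes_P   <- n_edges * n_states^2 * 8
bytes_lik <- (2 * n_tips - 1) * n_states * 8
round((bytes_P + bytes_lik) / 1024^3, 2)  # ~1.7 GiB under these assumptions
```

That estimate comes out well below the RAM on the machine, so I suspect I'm missing where the real memory use comes from (per-core copies during the optimization, overhead from the large state space, etc.), which is part of what I'm hoping you can clarify.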