Open rmcminds opened 6 years ago
option 3 might also offer a nice way to incorporate an estimate of codivergence. In theory, if a pair of species is perfectly codiverging, neither the rate of evolution nor the time since last divergence should have any effect on the variance. That is, all branch lengths should be identical, and a clade with more divergences will thus have more overall variance. So, a model could have a baseline ultrametric tree-based variance structure, plus two other trees (one with branch lengths representing the rate of molecular evolution, one with uniform branch lengths) whose influence is estimated during fitting. The model would then explicitly compare the relative importance of time, molecular evolution, and vicariance.
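A minimal numpy sketch of that idea, with illustrative names and toy values (none of this is taken from the actual model code): build a Brownian-motion-style covariance from each of the three trees and let a weight vector, which would be a free parameter in the real fit, apportion variance among them.

```python
import numpy as np

def combined_covariance(cov_time, cov_rate, cov_uniform, weights):
    """Weighted sum of three phylogenetic covariance matrices.

    cov_*   : (n_tips, n_tips) shared-branch-length (BM) covariances from
              the ultrametric tree, the molecular-rate tree, and the
              uniform-branch-length tree, respectively.
    weights : length-3 nonnegative vector; in a full model these would be
              estimated during fitting rather than fixed.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights are directly comparable
    return w[0] * cov_time + w[1] * cov_rate + w[2] * cov_uniform

# Toy two-tip example: a perfectly codiverging pair has identical branch
# lengths on the time and uniform trees, while the rate tree can differ.
cov_time    = np.array([[1.0, 0.5], [0.5, 1.0]])
cov_rate    = np.array([[2.0, 0.2], [0.2, 1.2]])
cov_uniform = np.array([[1.0, 0.5], [0.5, 1.0]])
V = combined_covariance(cov_time, cov_rate, cov_uniform, [0.6, 0.3, 0.1])
```

Normalizing the weights keeps the overall scale in a separate parameter, so the three components are interpretable as relative importances.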
currently, I've written the model to explicitly use a chronogram or ultrametric tree, which makes interpretation of many terms easier. However, if a particular clade has had more rapid molecular evolution, as would be reflected in a non-ultrametric tree, it is reasonable to ask whether that increase in molecular evolution is correlated with an increase in the rate of evolution of the traits assessed in our model. If so, a non-ultrametric tree would provide more accurate variance estimates than a forcibly ultrametricized one.
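To illustrate why the choice of tree matters (values here are made up): under Brownian motion the expected tip variance is the root-to-tip path length, so an ultrametric tree assigns every tip the same variance, while the raw molecular tree lets a fast-evolving clade accumulate more.

```python
import numpy as np

# Root-to-tip depths for three tips on each tree (illustrative numbers).
ultrametric_depths = np.array([10.0, 10.0, 10.0])  # all tips equidistant
raw_depths         = np.array([10.0, 10.0, 16.0])  # third tip evolved faster

sigma2 = 0.5  # assumed BM rate parameter

var_ultra = sigma2 * ultrametric_depths  # identical variance at every tip
var_raw   = sigma2 * raw_depths          # elevated variance for the fast clade
```

If trait evolution tracks molecular evolution, the second set of variances is the better description; if not, the ultrametricized tree is.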
Three ideas come to mind to implement this: (1) simplify the model and remove the time bins, at which point any tree could be used as input; (2) calculate the proportion of each branch that belongs to each time bin using an ultrametricized tree, but then fill the edgeToBin matrices with the raw branch lengths from the original tree; or (3) incorporate the original branch lengths into a model matrix that acts on variance parameters, and estimate another metaparameter for the correlation between those original branch lengths and the optimum variance.
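Idea (2) could be sketched like this (function and argument names are hypothetical, not the model's actual edgeToBin construction): compute each edge's proportional overlap with the time bins on the ultrametric tree, then scale those proportions by the raw branch lengths so the matrix carries molecular rather than temporal lengths.

```python
import numpy as np

def edge_to_bin(edge_starts, edge_ends, bin_edges, raw_lengths):
    """Rows = edges, cols = time bins.

    Each entry is the edge's raw (non-ultrametric) length, apportioned to a
    bin by the fraction of the edge's ultrametric span falling in that bin.
    """
    n_edges = len(edge_starts)
    n_bins = len(bin_edges) - 1
    out = np.zeros((n_edges, n_bins))
    for i in range(n_edges):
        ultra_len = edge_ends[i] - edge_starts[i]
        if ultra_len <= 0:
            continue  # zero-length edge contributes nothing
        for b in range(n_bins):
            lo, hi = bin_edges[b], bin_edges[b + 1]
            overlap = max(0.0, min(edge_ends[i], hi) - max(edge_starts[i], lo))
            out[i, b] = (overlap / ultra_len) * raw_lengths[i]
    return out

# One edge spanning ultrametric times 0-4, bins [0,2) and [2,5): the edge
# overlaps each bin for half its span, so its raw length 3.0 splits evenly.
m = edge_to_bin(edge_starts=[0.0], edge_ends=[4.0],
                bin_edges=[0.0, 2.0, 5.0], raw_lengths=[3.0])
```

Each row sums to the edge's raw length, so total molecular branch length is conserved while the binning still comes from the ultrametric timescale.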