By definition, under the standard OLS assumptions the estimated regression coefficients are multivariate normally distributed:
beta_hat ~ N_p(beta, (X^T X)^{-1} sigma^2)
We want to draw random coefficient vectors from this distribution to propagate coefficient uncertainty into the model.
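As a minimal sketch of the covariance in the formula above, the following computes (X^T X)^{-1} sigma^2 for a toy OLS fit with numpy. The data and coefficient values are made up for illustration; in the real pipeline this matrix would come straight from the fitted model (e.g. R's vcov()), not be recomputed by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy design matrix (intercept + 2 predictors) and a response
# generated from known coefficients -- illustrative values only.
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# OLS estimate beta_hat = (X^T X)^{-1} X^T y.
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y

# Unbiased residual variance estimate sigma^2_hat.
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)

# Coefficient covariance (X^T X)^{-1} sigma^2 -- the same matrix
# that vcov() returns for an lm fit in R.
Sigma = XtX_inv * sigma2_hat
print(Sigma.shape)  # (3, 3)
```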
[x] calculate the covariance matrix (X^T X)^{-1} sigma^2. Already available for most models via R's vcov() function; check that this works for clm models etc.
[x] add a step to the pipeline so that the model's fixed-effect coefficients are nudged by a random draw from N_p(0, (X^T X)^{-1} sigma^2). Either call a function in presetup or as soon as the model is loaded; it may be simpler to apply the nudge at every time step so that we rely on rpy2 as little as possible.
[x] sanity-check that the nudged coefficients don't produce nonsensical results.
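The nudging step in the checklist above could be sketched as below. `perturb_coefficients` is a hypothetical helper name, and the `beta_hat`/`Sigma` values are placeholders standing in for what would be pulled from the fitted R model (e.g. via coef() and vcov() through rpy2); the averaging at the end is the kind of sanity check the last item refers to.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder values: in the pipeline these would come from the
# fitted model's coef() and vcov() via rpy2.
beta_hat = np.array([1.0, 2.0, -0.5])
Sigma = np.diag([0.01, 0.02, 0.005])

def perturb_coefficients(beta_hat, Sigma, rng):
    """Nudge fixed-effect coefficients by a draw from N_p(0, Sigma)."""
    noise = rng.multivariate_normal(np.zeros(len(beta_hat)), Sigma)
    return beta_hat + noise

# Sanity check: over many draws the nudged coefficients should stay
# centred on beta_hat, with spread roughly sqrt(diag(Sigma)).
draws = np.array([perturb_coefficients(beta_hat, Sigma, rng)
                  for _ in range(5000)])
print(np.allclose(draws.mean(axis=0), beta_hat, atol=0.05))
```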
This whole approach might be handled better with Gibbs sampling; check whether there's an R package that does this for regression models. Start simple for now.