Open donotdespair opened 8 months ago
Development progress of Giannone, Lenza & Primiceri (2015, RESTAT) in branch develop-glp: `specify_*` function and `estimate` method for the particular models by #14

Hey @adamwang15
I have finished work on `specify_prior_bsvarSIGN`. It is working well and passes all the relevant tests in inst/tinytest/test_specify.R. But it crashes other parts of the algorithm, and I haven't yet introduced these changes. That's the upcoming task for both of us!
Cheers, T
Great! I think I fixed the crash, and the code now runs as usual. I will keep experimenting with the GLP code!
Hey @adamwang15 Amongst the Adaptive Metropolis algorithms for the GLP model hyper-parameter sampler, we should try the one by Vihola (2012). It is robust even for large dimensions of the candidate sampling density. There is an R package ramcmc implementing the RAM, and it's written using RcppArmadillo. It is nice, small, and has no dependencies other than Rcpp and RcppArmadillo, so we could use it directly!
There's just one point we need to investigate: the RAM is designed for symmetric candidate-generating densities. I am not sure how it would work if one needs truncation (to avoid sampling negative variance coefficients). I'm investigating this.
BTW, you were right about the meaning of $\alpha_i$. Nice! And also, that's much simpler than what I was coding!
Cheers, T
Hey @donotdespair thanks! Okay, I missed this one! In the meantime, I coded an adaptive Metropolis algorithm and it seems to be working. This answer suggests a log-transformation to sample positive parameters. I will try bringing in the ramcmc code; it looks like it can be adjusted to include the log-transformation as well.
ramcmc for proposal variance in the Metropolis algorithm
Papers we should implement
SVARs
BVARs
Papers that are useful
Parallel computing