Open · ckrapu opened this issue 3 years ago
The `AR` distribution appears to be nearly complete for use as a true vector autoregression parameterized by `p` cross-series coefficient matrices, each of shape `(d, d)`. The main change that needs to be made is to use a dot product instead of elementwise multiplication here. However, I am unable to determine the role of the `constant` argument and why it necessitates the calculation of `eps = value[self.p :] - self.rho[0] - x`, where, under the AR / VAR model, `eps` is assumed to have a diagonal Normal distribution.
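To make the proposed change concrete, here is a rough NumPy sketch of the one-step conditional mean in both forms. It is illustrative only and does not mirror the actual `AR` internals or its lag-ordering convention:

```python
import numpy as np

rng = np.random.default_rng(0)
p, d, T, t = 2, 3, 10, 5

# Univariate AR(p): rho is a length-p vector and the lags are combined
# by elementwise multiplication followed by a sum.
rho = rng.normal(size=p)
y = rng.normal(size=T)
lags = y[t - p : t][::-1]            # [y[t-1], ..., y[t-p]]
ar_mean = np.sum(rho * lags)         # elementwise multiply, then sum

# VAR(p): one (d, d) coefficient matrix per lag, and the lags are
# combined by matrix-vector products (a dot product per lag).
A = rng.normal(size=(p, d, d))
Y = rng.normal(size=(T, d))
var_mean = np.zeros(d)
for k in range(p):
    var_mean += A[k] @ Y[t - k - 1]  # dot product with the lag-(k+1) values
```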
The `constant` argument just adds an intercept term: `y = rho[0] + x`, where `x` is the convolution of the lagged values with the coefficients `rho[1:]`. When `constant=True`, `rho[0]` is the intercept rather than a lag coefficient, which is why it is subtracted when forming `eps = value[self.p :] - self.rho[0] - x`. The `AR` distribution was refactored for v4 in #5734.
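Concretely, here is a simplified NumPy sketch of that residual calculation (not the actual logp code, and the lag ordering shown is only illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
p, T = 2, 50
value = rng.normal(size=T)

# With constant=True, rho has length p + 1: rho[0] is the intercept
# and rho[1:] are the lag coefficients.
rho = rng.normal(size=p + 1)

# x is the "convolved" contribution of the lagged values and rho[1:],
# one entry per time step t >= p.
lags = np.stack([value[p - k - 1 : T - k - 1] for k in range(p)])  # (p, T - p)
x = np.sum(rho[1:, None] * lags, axis=0)

# The residual that receives an independent Normal density in the logp.
eps = value[p:] - rho[0] - x
```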
There is a worked code example of Bayesian Vector Autoregression here: https://www.pymc-labs.io/blog-posts/bayesian-vector-autoregression/
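For anyone who lands here before `AR` supports matrix-valued `rho`, a rough sketch of writing a VAR(p) likelihood by hand with a plain Normal observation model looks like the following (PyMC v4+ assumed; the variable names and priors are illustrative and not taken from that blog post):

```python
import numpy as np
import pymc as pm

# Toy data: T observations of a d-dimensional series.
T, d, p = 200, 3, 2
rng = np.random.default_rng(42)
data = rng.normal(size=(T, d))

with pm.Model() as var_model:
    # One (d, d) coefficient matrix per lag (row-vector convention), plus an intercept.
    A = pm.Normal("A", 0.0, 0.5, shape=(p, d, d))
    intercept = pm.Normal("intercept", 0.0, 1.0, shape=d)
    sigma = pm.HalfNormal("sigma", 1.0, shape=d)

    # Conditional mean for t = p, ..., T-1: intercept + sum_k y[t-k-1] @ A[k].
    mu = intercept + sum(
        pm.math.dot(data[p - k - 1 : T - k - 1], A[k]) for k in range(p)
    )

    # Diagonal (independent) Normal innovations, matching the eps discussed above.
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data[p:])
```

This treats the first p observations as fixed initial values rather than modelling them, which is a simplification; the missing piece in `AR` itself is allowing `rho` to carry these `(d, d)` matrices and applying the dot product internally.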