Open mathause opened 2 years ago
Removing from the milestone again. This remains a non-critical good first issue.
I suspect that (2) & (3) are mutually exclusive.
For (5) this could be dispatched in `draw_auto_regression_correlated` (which still has to be written).
For (5) we could add a separate function to draw innovations, which would make the whole thing more intuitive. However, as mentioned, this was painstakingly unified. I'll look through the whole autoregression part; maybe I can come up with something nice, but as you said, it's not critical.
The function `_draw_auto_regression_correlated_np` can be optimized to run faster - there are some fun optimizations that can be made.

Instead of `np.sum(a * b)` we can use `np.einsum`, which should be about twice as fast: https://github.com/MESMER-group/mesmer/blob/0c5b26569c8f0929d20c84448e7c8dc3635aff25/mesmer/core/auto_regression.py#L72
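A small sketch of the equivalence (the array names and shapes below are illustrative assumptions, not the ones in the mesmer source): the broadcast-multiply-then-sum pattern materializes an intermediate array, while `np.einsum` contracts over the lag axis directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical shapes: one AR coefficient per lag, a slice of the buffer
ar_coefs = rng.normal(size=4)          # (n_lags,)
out_slice = rng.normal(size=(4, 10))   # (n_lags, n_gridpoints)

# current pattern: broadcast-multiply, then sum over the lag axis
res_sum = np.sum(ar_coefs[:, np.newaxis] * out_slice, axis=0)

# einsum contracts the lag axis without the intermediate product array
res_einsum = np.einsum("l,lg->g", ar_coefs, out_slice)

np.testing.assert_allclose(res_sum, res_einsum)
```

Both give the same result; the speedup comes from skipping the temporary `(n_lags, n_gridpoints)` product.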
Use `innovations` as `out`. I think we don't actually need to create an `out` variable. This will avoid using twice the memory and should be a bit faster. To make it less confusing we can just do a shallow copy of the `innovations`: https://github.com/MESMER-group/mesmer/blob/0c5b26569c8f0929d20c84448e7c8dc3635aff25/mesmer/core/auto_regression.py#L69
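To illustrate the idea (a minimal AR(1) sketch with made-up names and shapes, not the mesmer implementation): instead of allocating a zero-filled `out` and adding the innovations into it at every step, start from a copy of the innovations and accumulate the AR terms in place.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ts, n_grid = 100, 5
innovations = rng.normal(size=(n_ts, n_grid))

# start from a copy so the original innovations stay untouched,
# then accumulate the autoregressive contribution in place
out = innovations.copy()
ar_coef = 0.5  # a single hypothetical AR(1) coefficient
for t in range(1, n_ts):
    out[t] += ar_coef * out[t - 1]
```

The copy keeps the caller's `innovations` intact while avoiding a second full-size buffer that only exists to be added to.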
Add the `intercept` to the `innovations`. The `intercept` is added in each pass - this may be suboptimal. However, we need to be careful about the buffer period. https://github.com/MESMER-group/mesmer/blob/0c5b26569c8f0929d20c84448e7c8dc3635aff25/mesmer/core/auto_regression.py#L74
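A sketch of why the two orderings agree for a simple AR(1) recursion (names and shapes are illustrative; the buffer-period caveat from the comment above is not modeled here):

```python
import numpy as np

rng = np.random.default_rng(0)
n_ts, n_grid = 50, 3
intercept = 1.5
ar_coef = 0.4
innovations = rng.normal(size=(n_ts, n_grid))

# variant 1: add the intercept on every pass through the loop
out_loop = innovations.copy()
out_loop[0] += intercept
for t in range(1, n_ts):
    out_loop[t] += intercept + ar_coef * out_loop[t - 1]

# variant 2: add the intercept to all innovations once, up front
out_once = innovations + intercept
for t in range(1, n_ts):
    out_once[t] += ar_coef * out_once[t - 1]

np.testing.assert_allclose(out_loop, out_once)
```

Adding the intercept once up front is a single vectorized addition instead of `n_ts` scalar additions inside the loop.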
Advanced optimizations:

The case `n_coefs=1` could be sped up by (i) using `out[:, t - ar_lags] @ ar_coefs` (even faster than `einsum`) and (ii) `np.random.normal` instead of `np.random.multivariate_normal`. However, I unified the two functions deliberately to make it easier to understand.
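Both univariate shortcuts can be sketched as follows (shapes and names are assumptions for illustration): with a single gridpoint the lag contraction is just a dot product, and a 1x1 covariance matrix degenerates to a scalar standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)

# (i) one value per lag: a plain dot product replaces einsum
ar_coefs = rng.normal(size=4)
out_slice = rng.normal(size=4)
res_einsum = np.einsum("l,l->", ar_coefs, out_slice)
res_matmul = out_slice @ ar_coefs
assert np.isclose(res_einsum, res_matmul)

# (ii) a 1x1 covariance matrix is just a variance, so the cheaper
# univariate sampler can replace multivariate_normal
cov = np.array([[0.25]])
draws_mv = rng.multivariate_normal(mean=[0.0], cov=cov, size=1000)
draws_uv = rng.normal(loc=0.0, scale=np.sqrt(cov[0, 0]), size=1000)
```

The two samplers draw from the same distribution; `multivariate_normal` just pays for a covariance decomposition that is trivial in the 1x1 case.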