One useful value to compute is relative entropy. We already have relative entropy implemented on distributions (commit 27155396d252d8fc9c458e18cc9cabc6a9b70abb). It would be convenient to construct a distribution from each of two timeseries and compute the relative entropy in one fell swoop.
Note that a local measure of relative entropy may not be well defined on timeseries. Unlike the other local measures implemented so far, naively averaging the local relative entropy will not generally recover the global relative entropy, because the average must be taken with respect to the posterior distribution, not the joint distribution. This point should be discussed further.
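A quick reminder of the definition makes the issue concrete. For distributions p (inferred from xs) and q (inferred from ys), the base-b relative entropy is

```
D_b(p || q) = Σ_x p(x) log_b( p(x) / q(x) )
```

so the local term log_b(p(x)/q(x)) is weighted by p alone, not by a joint distribution over both series.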
Proposed API:
double inform_relative_entropy(int const *xs, int const *ys, size_t n,
                               int bx, int by, double b, inform_error *err);
Example Usage:
#define N 9
int xs[N] = {0,0,1,1,1,1,0,0,0};
int ys[N] = {1,0,0,1,0,0,1,0,0};
inform_error err = INFORM_SUCCESS;
inform_relative_entropy(xs, ys, N, 2, 2, 2.0, &err); // == 0.038330
inform_relative_entropy(ys, xs, N, 2, 2, 2.0, &err); // == 0.037010