fishbacp opened 2 years ago
Have you solved the problem?
Hi there,
The error means that the observations must not contain negative values. PyInform treats observations as enumerated states of a system, so a negative value is unexpected; more specifically, a negative value throws off the calculation of the base of the logarithm, which is inferred from the observed states.
If your observations contain negative values, the solution is simply to remap them to non-negative integers using pyinform.utils.coalesce_series.
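To see the idea behind coalescing, here is a rough, stdlib-only sketch (not PyInform's actual implementation, and its exact mapping may differ): each distinct value is mapped to an integer in `0..b-1`, preserving the relative order of the values.

```python
def coalesce(series):
    """Map each distinct value to 0..b-1, preserving relative order.

    Rough sketch of the idea behind pyinform.utils.coalesce_series;
    the helper name is made up for this illustration.
    """
    mapping = {v: i for i, v in enumerate(sorted(set(series)))}
    return [mapping[v] for v in series], len(mapping)

coalesced, b = coalesce([0, -1, -1, -1, -1, 0, 0, 0, 0])
print(coalesced, b)  # [1, 0, 0, 0, 0, 1, 1, 1, 1] 2
```

The remapped series contains no negative values, so it is safe to pass to the estimators.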
For example:

```python
from pyinform import transfer_entropy
from pyinform import utils

xs = [0, -1, -1, -1, -1, 0, 0, 0, 0]
ys = [0, 0, 1, 1, 1, 1, 0, 0, 0]

# Remap xs to non-negative integers before computing transfer entropy
coal_xs, b = utils.coalesce_series(xs)
transfer_entropy(coal_xs, ys, k=2)
```
returns the correct answer:
0.6792696431662097
In general, all that matters for information-theoretic calculations is the distribution of states, not the actual values of the states. So the entropy of xs = [-1, -1, 0] is the same as that of [+1, +1, 0], since both yield the probability distribution [2/3, 1/3].
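To make that concrete, here is a small stdlib-only sketch (deliberately not using PyInform) that computes the Shannon entropy of each series from its empirical distribution; the helper name `shannon_entropy` is made up for this illustration:

```python
import math
from collections import Counter

def shannon_entropy(series):
    """Shannon entropy (in bits) of the empirical distribution of a series."""
    counts = Counter(series)
    n = len(series)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Both series induce the same distribution [2/3, 1/3],
# so their entropies are identical.
print(shannon_entropy([-1, -1, 0]))
print(shannon_entropy([+1, +1, 0]))
```

Both calls print the same value, since the two distributions are identical.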
One last thing to note is that your observations should consist of discrete states. If you are working with continuous-valued observations, you will want to bin them first using pyinform.utils.binning.
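As a rough illustration of what binning does (this is a stdlib-only sketch of uniform equal-width binning, not PyInform's API; the helper `bin_uniform` and the bin count are made up for this example):

```python
def bin_uniform(series, b):
    """Discretize continuous values into b equal-width bins labeled 0..b-1.

    Assumes the series contains at least two distinct values.
    """
    lo, hi = min(series), max(series)
    width = (hi - lo) / b
    # Values exactly at the upper edge fall into the last bin.
    return [min(int((x - lo) / width), b - 1) for x in series]

data = [0.1, 0.35, 0.5, 0.72, 0.99]
print(bin_uniform(data, 2))  # [0, 0, 0, 1, 1]
```

The binned series is a sequence of non-negative integer states, which is what the estimators expect.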
I have two simple time series, xs and ys, with 5000 samples each. I attempted to compute the transfer entropy via

T = transfer_entropy(xs, ys, k)

using various values of the history length k. Each attempt yielded the following error message:
Any insights as to the error source? Should I be adjusting other keyword arguments?