MattWillFlood / EntropyHub

An open-source toolkit for entropic data analysis.
https://www.entropyhub.xyz
Apache License 2.0

Negative Approximate Entropy #11

Closed: JAC28 closed this issue 2 months ago

JAC28 commented 3 months ago

Although approximate entropy is always described as a non-negative quantity, the implemented estimator can return negative values in extreme cases, as the attached Python example shows.

import numpy as np
import EntropyHub as EH

# Set the seed to ensure reproducibility
np.random.seed(42)

# Generate an array of 100 random numbers
rnd_data = np.random.rand(100)

# ApEn of the full 100-sample series: positive, as expected
print(EH.ApEn(rnd_data, m=2, r=0.2*np.std(rnd_data))[0][-1])
# ApEn of only the first 12 samples: negative
print(EH.ApEn(rnd_data[:12], m=2, r=0.2*np.std(rnd_data[:12]))[0][-1])

The result is:

0.6736246486917667
-0.09531017980432521

If I understand the algorithm correctly, this behaviour is mathematically correct and occurs when there are only self-matches: for example, for $N=12$ the elements $C_i^m$ are each $1/(N-m+1) = 1/11$ and the $C_i^{m+1}$ are each $1/(N-m) = 1/10$. The corresponding logarithms are $-2.39789527$ and $-2.30258509$, so the difference $\phi^m-\phi^{m+1}$ is negative. The implementation in the package antropy produces the same results.
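The self-match-only arithmetic above can be checked directly, independently of EntropyHub (a minimal sketch of the worst case, where every template vector matches only itself):

```python
import numpy as np

# Self-match-only case from the issue: N = 12 samples, template length m = 2.
N, m = 12, 2

phi_m = np.log(1.0 / (N - m + 1))   # each C_i^m     = 1/11
phi_m1 = np.log(1.0 / (N - m))      # each C_i^{m+1} = 1/10

apen = phi_m - phi_m1               # phi^m - phi^{m+1}, here ln(10/11) < 0
print(phi_m, phi_m1, apen)          # apen ≈ -0.0953, matching the output above
```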

This is a very niche edge case and no longer occurs with the example data for $N \geq 15$. Nevertheless, a note/warning to the user would be useful in these cases.
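One possible shape for such a warning (a hypothetical sketch of a minimal ApEn with self-matches included, not EntropyHub's actual code; the function name `apen_sketch` is made up):

```python
import warnings
import numpy as np

def apen_sketch(sig, m=2, r=0.2):
    """Minimal ApEn sketch (Chebyshev distance, self-matches included)
    that warns when the estimate comes out negative."""
    sig = np.asarray(sig, dtype=float)
    N = len(sig)

    def phi(k):
        # All overlapping templates of length k: shape (N - k + 1, k)
        templates = np.array([sig[i:i + k] for i in range(N - k + 1)])
        # Pairwise Chebyshev distances between templates
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        # C_i: fraction of templates within tolerance r (self-match included)
        C = np.mean(d <= r, axis=1)
        return np.mean(np.log(C))

    apen = phi(m) - phi(m + 1)
    if apen < 0:
        warnings.warn("ApEn estimate is negative; with short records "
                      "only self-matches may occur (see this issue).")
    return apen
```

With the 12-sample record from the example above, this returns the same negative value and emits the warning.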

MattWillFlood commented 2 months ago

Hi @JAC28 🙂, Thank you for highlighting this very astute observation 💡💡 We will duly add a warning message to alert users in these rare cases 👍🏼 Thanks again 🙏🏼