JuliaDynamics / ChaosTools.jl

Tools for the exploration of chaos and nonlinear dynamics
https://juliadynamics.github.io/DynamicalSystemsDocs.jl/chaostools/stable/
MIT License

Numerical computation of Kolmogorov Sinai entropy #6

Open Datseris opened 6 years ago

Datseris commented 6 years ago

EDIT: A solution for this issue exists in the book by Schuster and Just, equation 6.107.

EDIT2: A solution for this issue also exists in the book by Kantz and Schreiber, section 11.4.5.

EDIT3: A solution also exists in P. Grassberger and I. Procaccia, “Estimation of the Kolmogorov entropy from a chaotic signal,” Phys. Rev. A, vol. 28, no. 4, pp. 2591–2593, 1983. They show how to compute a quantity approximately equal to the KS entropy.


From @Datseris on October 2, 2017 16:36

Another commonly used entropy is the "KS-entropy".

I would be reluctant to point people to the original papers (Ya.G. Sinai, On the Notion of Entropy of a Dynamical System, Doklady of Russian Academy of Sciences, (1959), 124, 768-771. and A.N. Kolmogorov, Entropy per unit time as a metric invariant of automorphism, Doklady of Russian Academy of Sciences, (1959), 124, 754-755.) as they are veeery mathematical and were written before computers became a thing. I also do not know any paper that treats the subject (I haven't searched for one!).

I think @RainerEngelken has done this, maybe he can help us a bit here with some comments.

Copied from original issue: JuliaDynamics/DynamicalSystems.jl#28

tkf commented 6 years ago

Are you talking about Pesin's identity/inequality? I think in practice people just assume hyperbolicity and calculate the sum of the positive Lyapunov exponents.

http://www.scholarpedia.org/article/Pesin_entropy_formula
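For illustration, a rough sketch of that approach, assuming the `lyapunovs` spectrum function mentioned further down in this thread (later versions of DynamicalSystems.jl call it `lyapunovspectrum`) and the predefined `Systems.lorenz()`; exact names and signatures may differ between releases:

```julia
# Sketch: Pesin-identity estimate of h_KS as the sum of the positive
# Lyapunov exponents. Valid under the assumption of an ergodic SRB
# measure (see the discussion further below).
using DynamicalSystems

ds = Systems.lorenz()                      # predefined Lorenz-63 system
λs = lyapunovs(ds, 100_000; Ttr = 100.0)   # full Lyapunov spectrum

h_KS = sum(max(λ, 0.0) for λ in λs)        # Pesin: h_KS = Σ_{λᵢ > 0} λᵢ
println("Pesin estimate of h_KS ≈ ", h_KS)
```

For the standard Lorenz-63 parameters there is a single positive exponent of roughly 0.9, so the estimate should land near that value.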

Datseris commented 6 years ago

I am talking about the Kolmogorov Sinai entropy: http://www.scholarpedia.org/article/Kolmogorov-Sinai_entropy

From what I recall from the lectures I have attended:

You define a partition of size ε and compute the Shannon entropy H_n of the symbolized orbit over blocks of length n. Then you compute the Shannon entropy H_{n+1} for blocks of length n+1 and define the KS entropy as

$$h_\mathrm{KS} = \lim_{\varepsilon \to 0} \lim_{N \to \infty} \frac{1}{N} \sum_{n=0}^{N-1} \left( H_{n+1} - H_n \right)$$

This is from the book "Deterministic Chaos" by Schuster.

I looked at the page you cited, but the word "Kolmogorov" is not even mentioned on the entire page, so I am sceptical as to whether there is some straight-forward connection.
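To make the definition above concrete, here is a minimal, self-contained sketch on a scalar time series (the fully chaotic logistic map, whose KS entropy equals its Lyapunov exponent ln 2 ≈ 0.69). The double limit ε → 0, N → ∞ is of course out of numerical reach, and the block statistics degrade quickly as n grows; all names and parameters below are ad hoc:

```julia
# Illustrative only: partition-based block entropies H_n of a symbolized
# orbit, and their successive differences H_{n+1} - H_n, whose limit
# (ε → 0, n → ∞) defines h_KS.

"Shannon entropy (in nats) of the length-`n` blocks of the symbol sequence `s`."
function block_entropy(s::AbstractVector{<:Integer}, n::Integer)
    counts = Dict{Vector{Int},Int}()
    for i in 1:length(s) - n + 1
        block = s[i:i+n-1]
        counts[block] = get(counts, block, 0) + 1
    end
    total = sum(values(counts))
    return -sum(c/total * log(c/total) for c in values(counts))
end

# Orbit of the fully chaotic logistic map, symbolized with bin size ε
logistic(x) = 4.0 * x * (1.0 - x)
x = zeros(100_000); x[1] = 0.4
for i in 2:length(x)
    x[i] = logistic(x[i-1])
end

ε = 0.05
symbols = floor.(Int, x ./ ε)     # index of the ε-box containing each point

for n in 1:6
    h = block_entropy(symbols, n + 1) - block_entropy(symbols, n)
    println("n = $n:  H_{n+1} - H_n ≈ ", round(h, digits = 3))
end
```

The printed differences should hover around ln 2 for moderate n, before finite-sample effects set in.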

Datseris commented 6 years ago

Alright, after some searching it appears they are indeed connected, but I still don't see a straightforward way of computing the KS entropy from that.

Using just the sum of the positive Lyapunov exponents is not enough of course :P people can already do that with the `lyapunovs` function.

tkf commented 6 years ago

Oh, I didn't realize that the Scholarpedia entry of the Pesin's formula didn't mention KS entropy by name. Not a good reference indeed.

> Using just the sum of the positive Lyapunov exponents is not enough of course :P

Why do you think so? I don't think you want to estimate the KS entropy directly, because of the combinatorial explosion of the pre-images of the partitions you have to track.

Check out, e.g., Eq. (4.4) and the equation next to it in Eckmann & Ruelle (1985), and Eq. (A3) in Hunt & Ott (2015). I think this is also how people compute it in practice: e.g., Monteforte & Wolf (2010).

PS: Yeah, github should support equations...

tkf commented 6 years ago

Hmm... But not impossible even without the Lyapunov exponents? http://iopscience.iop.org/article/10.1209/epl/i2005-10515-2/meta

RainerEngelken commented 6 years ago

Just as an addition: a dynamical system does not need to be hyperbolic for Pesin's identity to hold, but hyperbolicity is indeed a sufficient condition. Pesin's identity holds if and only if the dynamical system is endowed with an ergodic SRB measure, which briefly means that the attractor has smooth densities along the unstable manifolds. This was shown in this paper: https://www.math.nyu.edu/~lsy/papers/metric-entropy-part1.pdf Here is a review of SRB measures and which dynamical systems have them: https://cims.nyu.edu/~lsy/papers/SRBsurvey.ps

tkf commented 6 years ago

@RainerEngelken Cool! Thanks for filling in the important details. (BTW, I liked your JuliaCon talk!)

Datseris commented 6 years ago

Hi people! First of all let me just say that I am very glad that anybody else besides myself wants to improve these libraries!

A bit off-topic, but let me also say that unfortunately in the coming months I won't have any time to make improvements myself, because I need to focus on my PhD... :( However, I promise that if anybody wants to make a PR I will definitely review it and merge it ASAP; I can also make you collaborators.

Alrighty, now on-topic. I am not familiar with SRB measures or the papers Rainer cited, but they are much appreciated! If at some point somebody wants to contribute something for this issue, they can draw on this advice.

I think we can then say that this method is low-priority, since Takafumi pointed out that in research people often use the sum of the positive λs. @RainerEngelken how did you do it for the neural system you had? Did you compute the sum as well, or did you use a method that computes the entropy directly?

Datseris commented 6 years ago

In addition, a warning/heads-up:

If you want to do a PR, please consider doing it after this issue is closed: https://github.com/JuliaDynamics/DynamicalSystemsBase.jl/issues/18 so that you don't have to rewrite things again and again. It will be a massive change of the internals, so pretty much all code that doesn't use `Dataset` will be affected.

tkf commented 6 years ago

Yeah, I think this issue is low-priority, too. The breaking changes propagated from DifferentialEquations.jl sound much more important.

PS: Good luck on your PhD!

Datseris commented 3 years ago

This paper:

P. Grassberger and I. Procaccia, “Estimation of the Kolmogorov entropy from a chaotic signal,” Phys. Rev. A, vol. 28, no. 4, pp. 2591–2593, 1983

has a straightforward numerical algorithm for calculating a quantity approximately equal to the KS entropy (which they call K_2).
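As a rough illustration of the structure of that algorithm (not the paper's exact procedure, nor an existing ChaosTools.jl API), here is a self-contained, brute-force sketch: correlation sums C_m(ε) of delay embeddings of increasing dimension m, with K_2 ≈ ln[C_m(ε)/C_{m+1}(ε)] / (τ Δt). The Hénon test data and all parameter choices are placeholders:

```julia
# Grassberger–Procaccia K₂ sketch: correlation sums of delay embeddings
# of increasing dimension m; K₂ ≈ log(C_m(ε)/C_{m+1}(ε)) / (τ Δt).
# Brute-force O(N²) pair counting, illustrative parameters only.

"Delay embedding of `x` with dimension `m` and delay `τ` (in samples)."
embed(x, m, τ) = [x[i .+ (0:m-1) .* τ] for i in 1:length(x) - (m-1)*τ]

"Correlation sum of `pts` at radius `ε` (maximum norm)."
function correlation_sum(pts, ε)
    N, c = length(pts), 0
    for i in 1:N-1, j in i+1:N
        c += maximum(abs.(pts[i] .- pts[j])) < ε
    end
    return 2c / (N * (N - 1))
end

# Example data: x-coordinate of the Hénon map (Δt = 1 for a map)
N = 2000
x = zeros(N); y = zeros(N); x[1] = y[1] = 0.1
for i in 2:N
    x[i] = 1 - 1.4 * x[i-1]^2 + y[i-1]
    y[i] = 0.3 * x[i-1]
end

Δt, τ, ε = 1.0, 1, 0.1
for m in 2:5
    Cm  = correlation_sum(embed(x, m,     τ), ε)
    Cm1 = correlation_sum(embed(x, m + 1, τ), ε)
    println("m = $m:  K₂ ≈ ", round(log(Cm / Cm1) / (τ * Δt), digits = 3))
end
```

For the standard Hénon map the estimate should come out somewhat below the positive Lyapunov exponent (≈ 0.42), consistent with K_2 being a lower bound on the KS entropy.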

Datseris commented 3 years ago

A solution for this issue exists in the book by Schuster and Just, equation 6.107.