JuliaDynamics / ComplexityMeasures.jl

Estimators for probabilities, entropies, and other complexity measures derived from data in the context of nonlinear dynamics and complex systems

Fluctuation complexity, restrict possibilities to formally defined self-informations #413


kahaaga commented 3 weeks ago

What's this?

Here I address #410 and restrict the fluctuation complexity to information measures for which it is possible to define "self-information" in the following sense.

Given an information measure $H$, I define the "generalized" self-information as the functional $I(p_i)$ that lets us re-write $H$ as the probability-weighted sum $H = \sum_i p_i I(p_i)$ (a weighted average; since $\sum_i p_i = 1$, the denominator of the average doesn't appear explicitly).

Next, the fluctuation complexity is $\sigma_I = \sqrt{\sum_{i=1}^N p_i \left(I(p_i) - H\right)^2}$. Hence, using the formulation above, we can meaningfully speak of the fluctuation of local information around the mean information, regardless of which measure is chosen.
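For concreteness, here are the two easiest cases (standard rewritings, not new results): for Shannon entropy the functional recovers the usual self-information, and for Tsallis entropy the rewriting follows directly from $\sum_i p_i = 1$:

$$
H = -\sum_i p_i \log p_i \;\Rightarrow\; I(p_i) = -\log p_i,
\qquad
S_q = \frac{1 - \sum_i p_i^q}{q - 1} = \sum_i p_i \, \frac{1 - p_i^{q-1}}{q - 1} \;\Rightarrow\; I_q(p_i) = \frac{1 - p_i^{q-1}}{q - 1}.
$$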

I also require that the generalized self-information yields a fluctuation complexity with the same properties as the original Shannon-based fluctuation complexity.

Note that we don't involve the axioms that Shannon self-information fulfills at all: we only demand that the generalized self-information is the functional with the properties above. I haven't been able, at least so far, to find any papers in the literature that deal with this concept for Tsallis or other generalized entropies, so I think it is safe to explore with this naming convention.

New design
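
(The PR code itself is not reproduced here; the following is a minimal, non-authoritative sketch of the kind of dispatch-based design being proposed, with illustrative names.)

```julia
# NOTE: not the actual PR code — a minimal sketch of the dispatch-based
# idea, with illustrative names. Each measure defines its own generalized
# self-information I(p) such that H = Σᵢ pᵢ I(pᵢ).
abstract type InformationMeasure end

Base.@kwdef struct Shannon <: InformationMeasure
    base::Int = 2
end

struct Tsallis <: InformationMeasure
    q::Float64
end

# Generalized self-information: one method per measure.
self_information(m::Shannon, pᵢ) = -log(m.base, pᵢ)
self_information(m::Tsallis, pᵢ) = (1 - pᵢ^(m.q - 1)) / (m.q - 1)

# Any measure, written as a probability-weighted sum of self-informations.
information(m::InformationMeasure, probs) =
    sum(p * self_information(m, p) for p in probs)

# Fluctuation complexity: std. dev. of self-information around its mean H.
function information_fluctuation(m::InformationMeasure, probs)
    H = information(m, probs)
    return sqrt(sum(p * (self_information(m, p) - H)^2 for p in probs))
end
```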

Progress

I've made the necessary derivations for the measures where the calculations looked easiest: Shannon entropy/extropy, Tsallis entropy, and Curado entropy. I'll fill in the gaps for the remaining measures whenever I get some free time.

I'm writing this all up in a paper, where I also highlight ComplexityMeasures.jl and how easy our discrete estimation API makes it to use the measure in practice. I've essentially finished the intro and methods, but the experimental part remains to be done. For that, I need working code. So before I proceed, I'd like to get your input on this code proposal, @Datseris. Does this dispatch-based system make sense?

Pending the paper, I verify correctness by numerical comparison in the test suite: I re-write the information measures as weighted sums involving `self_information`, and check that we obtain the same values as when computing the measures with their traditional formulations.
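
A sketch of what such a check could look like, reusing the illustrative names from the sketch above:

```julia
using Test
# Compare the weighted-sum formulation against the traditional closed forms.
probs = [0.1, 0.2, 0.3, 0.4]

# Shannon: -Σᵢ pᵢ log2(pᵢ)
@test information(Shannon(), probs) ≈ -sum(p * log(2, p) for p in probs)

# Tsallis: (1 - Σᵢ pᵢ^q) / (q - 1)
q = 2.0
@test information(Tsallis(q), probs) ≈ (1 - sum(p^q for p in probs)) / (q - 1)
```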

Datseris commented 2 weeks ago

This all sounds good to me; however, I dislike the term `self_information`. I don't understand at all what the "self" refers to. The correct name for this quantity would be `information`, but we have used that name already for the average information in a whole signal. `surprisal` is okay; in our textbook we just call this surprise. But perhaps `unit_information` is better overall?

Datseris commented 2 weeks ago

Oh, just now I saw the wiki https://en.wikipedia.org/wiki/Information_content

Right, I would still stay away from "self", but we can use `information_content` instead.

kahaaga commented 2 weeks ago

> Oh, just now I saw the wiki https://en.wikipedia.org/wiki/Information_content
>
> Right, I would still stay away from "self", but we can use `information_content` instead.

I'm fine with whatever term we use, as long as it is rooted in common literature usage. We can go for `information_content`. 👍

kahaaga commented 2 weeks ago

@Datseris I'm also a bit hesitant about the name `FluctuationComplexity`. I mean: yes, it is a complexity measure in the sense that it uses entropies & friends. However, I think a better, more general name is `InformationFluctuation`, which is more in line with the fact that we define `information_content`.

This goes a bit against the terminology used in the literature, but I think it is more precise. What do you think?

EDIT: or maybe just `Fluctuation(measure::InformationMeasure)`. Then it is implicit that it is a fluctuation of information, since it takes an `InformationMeasure` as input.
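
For illustration, a hedged sketch of how that could look, building on the sketch above (names are again illustrative):

```julia
# Hypothetical wrapper: wrapping any InformationMeasure makes it implicit
# that we compute the fluctuation of that measure's self-information.
struct Fluctuation{M <: InformationMeasure}
    measure::M
end

fluctuation(f::Fluctuation, probs) = information_fluctuation(f.measure, probs)

fluctuation(Fluctuation(Shannon()), [0.25, 0.25, 0.5])    # Shannon-based
fluctuation(Fluctuation(Tsallis(1.5)), [0.25, 0.25, 0.5]) # Tsallis-based
```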

kahaaga commented 2 weeks ago
[Screenshot 2024-06-18: excerpt from the paper draft showing the proposed `Fluctuation` syntax]

This is how it will look in the paper if we use `Fluctuation`. I think this is nice, clean syntax.

Datseris commented 2 weeks ago

We can keep `FluctuationComplexity` and have it as a complexity measure with a reference to the literature article. It dispatches to `InformationFluctuation` with `Shannon`. The name `InformationFluctuation` can do the generic thing you are attempting in this PR, and should cite your paper once you are done with it.
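
Assuming the wrapper sketched above (renamed to `InformationFluctuation`), that dispatch could be as simple as:

```julia
# Hypothetical sketch: keep the literature name as the Shannon special case.
const InformationFluctuation = Fluctuation
FluctuationComplexity() = InformationFluctuation(Shannon())
```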

Datseris commented 2 weeks ago

I don't like the generic `Fluctuation`, because it is too generic: fluctuation-dissipation theorems, for example.