JuliaDynamics / ComplexityMeasures.jl

Estimators for probabilities, entropies, and other complexity measures derived from data in the context of nonlinear dynamics and complex systems

Implement fluctuation complexity #409

Closed · kahaaga closed this 3 weeks ago

kahaaga commented 3 weeks ago

What's this?

The fluctuation complexity characterizes how much the information content of individual states deviates from a summary statistic (an information measure) computed for the same distribution. Here, I've generalized the original measure, which uses Shannon entropy, to be compatible with any `InformationMeasure`.
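For reference, the original Shannon-based definition (Bates & Shepard, 1993) is the standard deviation of the per-state information content around the Shannon entropy:

$$
\sigma_I(p) = \sqrt{\sum_{i=1}^{N} p_i \left( \log_b \frac{1}{p_i} - H_S(p) \right)^2},
\qquad
H_S(p) = \sum_{i=1}^{N} p_i \log_b \frac{1}{p_i}.
$$

The generalization in this PR replaces the Shannon entropy $H_S$ with the value of the chosen information measure; see the diff for the exact form.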

Code changes

codecov[bot] commented 3 weeks ago

Codecov Report

Attention: Patch coverage is 90.00000% with 1 line in your changes missing coverage. Please review.

Project coverage is 91.97%. Comparing base (faa492a) to head (f1464dd). Report is 18 commits behind head on main.

| Files | Patch % | Lines |
| --- | --- | --- |
| ...tion_measure_definitions/fluctuation_complexity.jl | 90.00% | 1 Missing :warning: |

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #409      +/-   ##
==========================================
+ Coverage   89.29%   91.97%   +2.67%
==========================================
  Files          79       86       +7
  Lines        2271     2468     +197
==========================================
+ Hits         2028     2270     +242
+ Misses        243      198      -45
```

:umbrella: View full report in Codecov by Sentry.

Datseris commented 3 weeks ago

Is this really an `InformationMeasure` or a `ComplexityEstimator`? Because it sounds like the latter. Which axiomatic definition of an information measure does it satisfy? It isn't an entropy or an extropy, as far as I can tell.

kahaaga commented 3 weeks ago

> Is this really an `InformationMeasure` or a `ComplexityEstimator`? Because it sounds like the latter. Which axiomatic definition of an information measure does it satisfy? It isn't an entropy or an extropy, as far as I can tell.

It is a functional of a PMF, so it is an information measure according to our definition.

(Screenshot 2024-06-07 at 22:50: the package docs' definition of an information measure as a functional of a probability mass function.)

We do not impose any demands on the axiomatic foundation of a measure in our API for it to qualify as an information measure, only that it is a functional of a PMF. Entropies and extropies are examples of information measures, but not the only ones. The authors of this method themselves call it an "information complexity measure":

(Screenshot 2024-06-07 at 22:54: excerpt from the source publication, which calls the quantity an "information complexity measure".)

I think we should stick with this simple distinction: probabilities-based or non-probabilities-based. Upstream, I've named all measures that are functionals of probabilities "information measures" too (KL divergence, relative entropy, mutual information, etc.), and they are computed using the `information` function, just with more input datasets.
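To illustrate "functional of a PMF" concretely, here is a minimal standalone sketch of the Shannon-based case (the names `selfinfo` and `fluctuation_complexity` are illustrative, not the package implementation):

```julia
# Information content of a single state with probability p.
selfinfo(p; base = 2) = -log(base, p)

# Fluctuation complexity as a functional of a PMF: the standard deviation
# of the per-state information content around the Shannon entropy.
function fluctuation_complexity(probs; base = 2)
    h = sum(p * selfinfo(p; base) for p in probs if p > 0)  # Shannon entropy
    return sqrt(sum(p * (selfinfo(p; base) - h)^2 for p in probs if p > 0))
end

fluctuation_complexity([0.5, 0.25, 0.25])  # depends on nothing but the PMF
```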

kahaaga commented 3 weeks ago

To add: this is an information measure in our framework because it can be estimated using any outcome space, any probabilities estimator, and any generic information measure estimator (as a consequence of being a functional of probabilities):

```julia
information(Jackknife(FluctuationComplexity()), BayesianRegularization(), OrdinalPatterns(), x)
```

whereas things like `SampleEntropy` cannot be estimated in this way.
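For comparison, a plain plug-in estimate works too. This is a sketch, assuming `FluctuationComplexity()` defaults to the Shannon-based form and composes with the `information(measure, outcome_space, x)` convenience signature:

```julia
using ComplexityMeasures  # assuming this PR is merged

x = randn(10_000)

# Plug-in estimation: ordinal-pattern outcome space with
# relative-frequency probabilities.
information(FluctuationComplexity(), OrdinalPatterns(), x)
```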

Datseris commented 3 weeks ago

okay, sounds good. distinction based on "functional of PMF or PDF" is fine for me.

kahaaga commented 3 weeks ago

> okay, sounds good. distinction based on "functional of PMF or PDF" is fine for me.

Good, then we're on the same page 👍