Closed · kahaaga closed 3 weeks ago
Attention: Patch coverage is 90.00000% with 1 line in your changes missing coverage. Please review.
Project coverage is 91.97%. Comparing base (faa492a) to head (f1464dd). Report is 18 commits behind head on main.
| Files | Patch % | Lines |
|---|---|---|
| ...tion_measure_definitions/fluctuation_complexity.jl | 90.00% | 1 Missing :warning: |
Is this really an `InformationMeasure` or a `ComplexityEstimator`? Because it sounds like the latter. What axiomatic definition of an information measure does this satisfy? It isn't an entropy, or extropy, as far as I can tell.
It is a functional of a PMF, so it is an information measure according to our definition.
We do not impose any demands on the axiomatic foundations of a measure in our API for it to qualify as an information measure; only that it is a functional of a PMF. Entropies and extropies are examples of information measures, but not the only ones. The authors of this method themselves call it an "information complexity measure":
I think we should stick with this simple distinction: non-probabilities-based or probabilities-based. Upstream, I've also named all measures that are functions of probabilities "information measures" (KL divergence, relative entropy, mutual info, etc.) too, and they are computed using the `information` function, just with more input datasets.
To add: this is an information measure in our framework because it is something that can be estimated using any outcome space, any probabilities estimator, and any generic information measure estimator (as a consequence of being a probabilities functional):

```julia
information(Jackknife(FluctuationComplexity()), BayesianRegularization(), OrdinalPatterns(), x)
```

whereas things like `SampleEntropy` cannot be estimated in this way.
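For concreteness, here is that call as a runnable sketch (assuming the ComplexityMeasures.jl API this PR targets; `x` is just dummy data, and the specific estimator choices are arbitrary):

```julia
using ComplexityMeasures

x = randn(10_000)  # dummy time series

# The same measure composes with any outcome space, probabilities estimator,
# and generic discrete information estimator:
information(FluctuationComplexity(), ValueBinning(RectangularBinning(10)), x)
information(Jackknife(FluctuationComplexity()), BayesianRegularization(), OrdinalPatterns(), x)
```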
Okay, sounds good. A distinction based on "functional of a PMF or PDF" is fine for me.
Good, then we're on the same page 👍
What's this?
The fluctuation complexity characterizes the deviations of the information content of individual states from some summary statistic (an information measure) computed for the same distribution. I've here generalized the original measure, which uses Shannon entropy, to be compatible with any `InformationMeasure` (see the sketch below).

Code changes

- Adds a `FluctuationComplexity` type, which can be used with `information` in combination with any estimation method, just like the other entropies & friends. `Shannon` entropy is the default summary statistic.
- No `information_maximum` method for now, because I couldn't see any way of generally computing the maximum value without some finicky additions to the codebase, which I don't want to deal with now.
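For reference, a minimal sketch of the quantity being computed, assuming the generalized definition described above: the weighted root-mean-square deviation of each state's self-information from the summary statistic. The helper name and the Shannon special case are illustrative, not the package implementation:

```julia
# Illustrative sketch (not the package implementation): fluctuation complexity
# with Shannon entropy H as the summary statistic.
function fluctuation_complexity_sketch(p::AbstractVector{<:Real}; base = 2)
    h = -sum(pᵢ * log(base, pᵢ) for pᵢ in p if pᵢ > 0)  # Shannon entropy H
    # Weighted root-mean-square deviation of self-information -log_b(pᵢ) from H.
    return sqrt(sum(pᵢ * (-log(base, pᵢ) - h)^2 for pᵢ in p if pᵢ > 0))
end

fluctuation_complexity_sketch([0.5, 0.25, 0.25])  # == 0.5 bits
```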