Closed ikottlarz closed 1 year ago
Merging #268 (6b0b898) into main (6bdeaf2) will increase coverage by 0.06%. The diff coverage is 100.00%.
:exclamation: Current head 6b0b898 differs from pull request most recent head 65ee9e9. Consider uploading reports for the commit 65ee9e9 to get more accurate results.
```diff
@@            Coverage Diff             @@
##             main     #268      +/-   ##
==========================================
+ Coverage   82.54%   82.61%   +0.06%
==========================================
  Files          51       51
  Lines        1318     1317       -1
==========================================
  Hits         1088     1088
+ Misses        230      229       -1
```

Impacted Files | Coverage Δ |
---|---|
src/complexity_measures/statistical_complexity.jl | 100.00% <100.00%> (ø) |

... and 1 file with indirect coverage changes
This solution should be okay. It generalizes StatisticalComplexity to any naive plug-in entropy estimate (internally represented as MLEntropy). When #237 is done, there will be more estimators for which we don't know the maximum entropy, because not all (or perhaps any) of the error-corrected estimators provide min-max bounds on the correction. But that is okay, because we simply don't define entropy_maximum for them, and the code will fall back on the existing error messages.
As for having simple tests, that's fine.
Looks good to me, so merging!
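The fallback described above can be sketched with plain multiple dispatch. This is a hedged illustration with made-up type names (BoundedEntropySketch, CorrectedEntropySketch), not the actual ComplexityMeasures.jl source: entropy_maximum is defined only where the bound is known, and a generic method errors informatively otherwise, so normalization fails loudly rather than silently.

```julia
# Illustrative sketch only; type names are invented for this example.
abstract type EntropySketch end
struct BoundedEntropySketch <: EntropySketch end    # maximum entropy known
struct CorrectedEntropySketch <: EntropySketch end  # error-corrected, no known bound

# Generic fallback: estimators without a known maximum hit this error.
entropy_maximum(e::EntropySketch, L::Int) =
    error("entropy_maximum is not known for $(typeof(e)); cannot normalize.")

# Specific method: for a plug-in entropy over L outcomes, the maximum is log(L).
entropy_maximum(::BoundedEntropySketch, L::Int) = log(L)
```

With this pattern, nothing extra has to be written for the error-corrected estimators: simply not defining a method for them is what triggers the error.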
@ikottlarz Please see comment https://github.com/JuliaDynamics/ComplexityMeasures.jl/issues/266#issuecomment-1628901995 in #266
Would using allprobabilities change/improve anything in the code here, or isn't it relevant to the statistical complexity?
allprobabilities is a function that is guaranteed to return probabilities for every outcome in Ω. It is guaranteed that length(allprobabilities(est, x)) == total_outcomes(est, x).
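That invariant can be illustrated with a self-contained toy version of the idea (my own sketch, not the package's implementation): unlike a plain frequency count, the result has one entry per outcome in Ω, with explicit zeros for outcomes that never occur in the data.

```julia
# Toy sketch of the `allprobabilities` idea: length(p) == length(Ω) always,
# with probability zero for unobserved outcomes.
function allprobabilities_sketch(x, Ω)
    counts = Dict(ω => 0 for ω in Ω)
    for xi in x
        counts[xi] += 1   # assumes every data point is a valid outcome in Ω
    end
    return [counts[ω] / length(x) for ω in Ω]
end

p = allprobabilities_sketch([1, 1, 2], 1:4)  # outcomes 3 and 4 are unobserved
# p == [2/3, 1/3, 0.0, 0.0], and length(p) == 4 == length(Ω)
```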
Hi @kahaaga @Datseris , sorry for the late reply!
Would using allprobabilities change/improve anything in the code here, or isn't it relevant to the statistical complexity?
Using it did make a small change (see current commit): before, I was basically creating an "allprobabilities" vector manually; now I'm just using that function. I also added an error message for the case where someone tries to call complexity(c, p) with an "incomplete" probabilities vector.
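Such a guard could look roughly like the following. This is a hypothetical sketch (the function name check_probabilities is invented here), not the code from the actual commit:

```julia
# Hypothetical guard: reject probability vectors that do not cover the
# full outcome space, and point the user at `allprobabilities`.
function check_probabilities(p::AbstractVector, total_outcomes::Int)
    length(p) == total_outcomes || error(
        "Probability vector has $(length(p)) entries, but the outcome space " *
        "has $total_outcomes outcomes. Use `allprobabilities` so that " *
        "unobserved outcomes appear with probability zero.")
    return p
end
```

Failing loudly here matters because a too-short vector would otherwise produce a silently wrong normalization rather than an obviously wrong answer.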
Perfect! Thank you for your effort.
@ikottlarz I slightly rewrote the keyword arguments in the docstring, linking to ProbabilitiesEstimator and EntropyDefinition, and mentioned that the distance measure is actually from Distances.jl, so that the user can more easily understand what to input to these arguments. I also clarified in the main docstring that your implementation is actually a generalisation of the original method.
Could you please check that what I wrote is factually true, and that you like the wording? If so, I'll merge this right away.
Hi there, this is a generalization of the StatisticalComplexity for non-Shannon entropies, solving #266. I ended up not creating another version of entropy_normalized, but just writing the expression for normalization inline there - hope that's fine! I added some small tests, but since we don't have any literature values for non-Shannon entropies, they are very minor.
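The inline normalization amounts to dividing the entropy by its maximum value. A self-contained sketch of the structure (my own illustration: Shannon entropy as the plug-in entropy, and a plain Euclidean distance standing in for whatever Distances.jl metric is chosen):

```julia
# Sketch of writing the normalization inline instead of calling
# `entropy_normalized`: complexity = disequilibrium × (h / h_max).
shannon(p) = -sum(pi -> pi > 0 ? pi * log(pi) : 0.0, p)

p = [0.5, 0.25, 0.25, 0.0]
uniform = fill(1 / length(p), length(p))

h_norm = shannon(p) / log(length(p))   # inline normalization; h_max = log(|Ω|)
dist = sqrt(sum(abs2, p .- uniform))   # stand-in for a Distances.jl metric
C = dist * h_norm                      # statistical-complexity-style product
```

For this particular p, h_norm works out to 0.75 (up to floating point), so the normalized entropy stays in [0, 1] as required.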