Closed kahaaga closed 6 months ago
All modified and coverable lines are covered by tests :white_check_mark:
:exclamation: No coverage uploaded for pull request base (tutorial@f010a01).
Fixed missing methods and exports (why were they removed to begin with, @Datseris? I'm thinking specifically of the allcounts etc. methods that were generated previously).

If the purpose was to have fewer methods, I'm okay with dropping allprobabilities in favor of just having allprobabilities_and_outcomes. But then all references to the method have to be removed from the docs, the docstrings, and the tests.

Otherwise I'm done here.
- Not sure what is best. Here I just modified the examples in the tutorial to be explicit about using Shannon.
I disagree with that. entropy must default to Shannon. To my understanding, in the wide literature of NLTS, in 99.9% of the cases where an entropy is computed it is the Shannon entropy. Hence this is a sensible default.
For all the rest, okay!
I disagree with that. entropy must default to Shannon. To my understanding, in the wide literature of NLTS, in 99.9% of the cases where an entropy is computed it is the Shannon entropy. Hence this is a sensible default.
Ok, I'm fine with Shannon entropy being the default.
Should I implement it or will you do it? It's just a matter of removing the Shannon instances from the tutorial and changing the else case here:
```julia
function entropy(args...)
    e = first(args)
    # Check the condition for throwing an error (if false)
    cond = if e isa ProbEstOrOutcomeSpace
        # Shannon is used as default information measure
        true
    elseif e isa InformationMeasure
        # Any subtype of entropy
        e isa Entropy
    elseif e isa InformationMeasureEstimator
        # Estimator is for any subtype of entropy
        e.definition isa Entropy
    else
        false
    end
    cond || throw(ArgumentError("""
        You have used `entropy` without an entropy definition
        ($(typeof(e))). Use `information` instead."""))
    return information(args...)
end
```
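For context, the guard condition above can be exercised in isolation. The following is a hedged sketch using simplified stand-in types (these are not the package's actual definitions, just a minimal hierarchy mirroring the names in the discussion):

```julia
# Stand-in type hierarchy, mimicking the package's measure types:
abstract type InformationMeasure end
abstract type Entropy <: InformationMeasure end
struct Shannon <: Entropy end
struct NotAnEntropy <: InformationMeasure end  # hypothetical non-entropy measure

# Stand-in for an estimator wrapping a measure definition:
struct PlugIn{M}
    definition::M
end

# The error-guard condition from `entropy` above, extracted for illustration:
is_entropy_call(e::InformationMeasure) = e isa Entropy
is_entropy_call(e::PlugIn) = e.definition isa Entropy

@assert is_entropy_call(Shannon())           # plain entropy definition: allowed
@assert is_entropy_call(PlugIn(Shannon()))   # estimator of an entropy: allowed
@assert !is_entropy_call(NotAnEntropy())     # non-entropy measure: rejected
```

The point of the guard is that entropy stays a strict subset of information: anything that is not an entropy definition (or an estimator of one) gets redirected to information.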
- why were they removed to begin with, @Datseris?
That was when counts became the main method and counts_and_outcomes was in the deprecation file. Now of course this is incorrect, as _and_outcomes is the correct method. Your changes are correct in any case!
Regarding allcounts: yes, neither allcounts nor allprobabilities should exist at all. Both must anyway explicitly compute all outcomes, hence they have no reason to exist; they can never be optimized to not calculate outcomes. Hence, only the _and_outcomes versions should exist for both. So we should just add allcounts and allprobabilities to the deprecations file and simply remove all references to them from the docs! Let me know if you do this in this PR or we do it in the tutorial PR after merging this one in.
Regarding allcounts: yes, neither allcounts nor allprobabilities should exist at all. Both must anyway explicitly compute all outcomes, hence they have no reason to exist; they can never be optimized to not calculate outcomes. Hence, only the _and_outcomes versions should exist for both. So we should just add allcounts and allprobabilities to the deprecations file and simply remove all references to them from the docs! Let me know if you do this in this PR or we do it in the tutorial PR after merging this one in.
I can add the deprecations here, no problem.
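For reference, a minimal sketch of what those deprecations could look like, using Base.@deprecate to forward the old names to the _and_outcomes variants. The _and_outcomes functions below are dummy stand-ins so the snippet is self-contained; in the package the real implementations already exist, and the exact argument lists are whatever the package currently exports:

```julia
# Dummy stand-ins for the real `_and_outcomes` implementations:
allcounts_and_outcomes(o, x) = ([2, 2], [1, 2])               # (counts, outcomes)
allprobabilities_and_outcomes(est, x) = ([0.5, 0.5], [1, 2])  # (probs, outcomes)

# Proposed deprecation stubs: old names keep working (with a deprecation
# warning on use), returning just the first element of the tuple.
Base.@deprecate allcounts(o, x) first(allcounts_and_outcomes(o, x))
Base.@deprecate allprobabilities(est, x) first(allprobabilities_and_outcomes(est, x))

@assert allcounts(nothing, nothing) == [2, 2]
@assert allprobabilities(nothing, nothing) == [0.5, 0.5]
```

This keeps user code running while all documentation references move over to the _and_outcomes versions.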
Should I implement it or will you do it? It's just a matter of removing the Shannon instances from the tutorial and changing the else case here
Since in my branch Shannon was the default, please restore this behavior before we merge your PR into mine.
Since in my branch Shannon was the default, please restore this behavior before we merge your PR into mine.
I haven't changed the behavior of the code, just the doc examples. Your doc build failed because entropy(p::Probabilities) isn't implemented. It also isn't tested for. But I'll fix both before merging here 👍
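For what it's worth, with Shannon as the default that missing method essentially amounts to a plug-in Shannon computation on the probability vector. A stand-alone sketch under that assumption, with a hypothetical Probabilities stand-in rather than the package's actual type:

```julia
# Hypothetical stand-in for the package's Probabilities wrapper:
struct Probabilities{T}
    p::Vector{T}
end

# Sketch of the convenience method: plug-in Shannon entropy in nats,
# skipping zero-probability bins (0 * log(0) is taken as 0).
shannon_entropy(probs::Probabilities) =
    -sum(pi * log(pi) for pi in probs.p if pi > 0)

h = shannon_entropy(Probabilities([0.5, 0.5]))
@assert isapprox(h, log(2))  # uniform two-outcome distribution: log(2) nats
```

In the package itself this would presumably just forward to the information machinery with a Shannon default rather than reimplement the sum.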
Ah sorry, my mind was stuck so far in the past, where Shannon was the default for entropy. That was probably so far in the past that it predates even the information function.
Ah sorry, my mind was stuck so far in the past, where Shannon was the default for entropy. That was probably so far in the past that it predates even the information function.
Yes, I think this is from 1.X or so :D
Some necessary changes to get #347 to pass CI and adhere to the API.

- probabilities_and_outcomes and allprobabilities_and_outcomes are consistently the methods implemented, and probabilities/allprobabilities are just calls to those functions. The exception is if implementing the latter methods is faster.
- Fixed missing methods and exports (why were they removed to begin with, @Datseris? I'm thinking specifically of the allcounts etc. methods that were generated previously).
- entropy(::Probabilities), which was used in the doc tutorial, is not valid syntax. I am happy with that, since I like to specify which entropy one wants to compute. However, we can also dispatch automatically to entropy(Shannon(), p). Not sure what is best. Here I just modified the examples in the tutorial to be explicit about using Shannon.
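The wrapper convention described in the first point (plain names as thin calls to the _and_outcomes methods) can be sketched as follows, with a dummy estimator standing in for the package's real estimator types:

```julia
# Dummy estimator; the real package dispatches on its estimator types.
struct DummyEstimator end

# The `_and_outcomes` variant is the one actually implemented:
function probabilities_and_outcomes(::DummyEstimator, x)
    outcomes = sort(unique(x))
    probs = [count(==(o), x) / length(x) for o in outcomes]
    return probs, outcomes
end

# The plain variant is just a call to it, per the convention in this PR:
probabilities(est::DummyEstimator, x) = first(probabilities_and_outcomes(est, x))

p = probabilities(DummyEstimator(), [1, 2, 2, 2])
@assert p == [0.25, 0.75]
```

Estimators for which dropping the outcomes enables a genuinely faster computation can then override the plain method, which is the exception the bullet mentions.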