kahaaga closed this pull request 1 year ago.
Merging #168 (ce728ad) into main (cce1c15) will increase coverage by 0.92%. The diff coverage is 90.21%.
@@ Coverage Diff @@
## main #168 +/- ##
==========================================
+ Coverage 84.01% 84.94% +0.92%
==========================================
Files 37 37
Lines 951 963 +12
==========================================
+ Hits 799 818 +19
+ Misses 152 145 -7
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/deprecations.jl | 0.00% <0.00%> (ø) | |
| .../estimators/nearest_neighbors/nearest_neighbors.jl | 100.00% <ø> (ø) | |
| ...es/estimators/order_statistics/order_statistics.jl | 100.00% <ø> (ø) | |
| ...ities_estimators/histograms/rectangular_binning.jl | 80.89% <ø> (ø) | |
| src/entropy.jl | 69.69% <72.22%> (-2.18%) | :arrow_down: |
| src/probabilities.jl | 58.62% <75.00%> (ø) | |
| src/entropies/convenience_definitions.jl | 100.00% <100.00%> (ø) | |
| ...estimators/nearest_neighbors/KozachenkoLeonenko.jl | 100.00% <100.00%> (ø) | |
| .../entropies/estimators/nearest_neighbors/Kraskov.jl | 100.00% <100.00%> (ø) | |
| src/entropies/estimators/nearest_neighbors/Zhu.jl | 100.00% <100.00%> (ø) | |
| ... and 21 more | | |
In summary, the entropy API is now:

- `entropy(e::Entropy, est::EntropyEstimator, x)`
- `entropy(e::Entropy, est::ProbabilitiesEstimator, x)`
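A minimal sketch of how the old data-in-the-middle signature could forward to the new data-last one during a deprecation window. The types below (including `MyEstimator`) and the zero-valued stand-in computation are hypothetical, not the actual Entropies.jl source:

```julia
# Hypothetical stand-in types; the real Entropies.jl definitions differ.
abstract type Entropy end
abstract type ProbabilitiesEstimator end
struct Shannon <: Entropy end
struct MyEstimator <: ProbabilitiesEstimator end

# New data-last signature (stand-in computation that just returns 0.0).
entropy(e::Entropy, est::ProbabilitiesEstimator, x::AbstractVector) = 0.0

# Old data-in-the-middle signature forwards to the new one with a warning,
# so existing user code keeps working across the 2.0 transition.
function entropy(e::Entropy, x::AbstractVector, est::ProbabilitiesEstimator)
    @warn "entropy(e, x, est) is deprecated; use entropy(e, est, x)" maxlog = 1
    return entropy(e, est, x)
end
```

With this in place, both `entropy(Shannon(), MyEstimator(), x)` and the old `entropy(Shannon(), x, MyEstimator())` resolve to the same computation.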
Why make a change to the existing API, which was `entropy(e::Entropy, x, est::ProbabilityEstimator)`? What was wrong with that? Can we focus on minimizing the change here, given that this seems to be a change that is up to preference without much practical implication? Or are there practical arguments for changing the whole API? `entropy(e::Entropy, x, est::EntropyEstimator)` would be the only change if we don't change the probabilities stuff.
> Why make a change to the existing API, which was `entropy(e::Entropy, x, est::ProbabilityEstimator)`? What was wrong with that? Can we focus on minimizing the change here, given that this seems to be a change that is up to preference without much practical implication? Or are there practical arguments for changing the whole API? `entropy(e::Entropy, x, est::EntropyEstimator)` would be the only change if we don't change the probabilities stuff.
Two reasons:

1. Following Julia's style guide, which recommends writing functions with argument ordering similar to Julia Base: types first, then input data. We make a huge effort to write good, clear, and consistently formatted code, and there's no reason this shouldn't apply to `entropy` & friends.
2. Having the input data as the last argument makes it easier to provide multiple datasets. One can then do `entropy(Shannon(), Kraskov(), x, y, z, ...)` to compute the joint entropy between `x, y, z, ...`. This also simplifies code upstream in CausalityTools.

The order of the arguments for CausalityTools functions in v2 will be `someinfomeasure(::TypeOfMeasure, ::Estimator, input_x, input_y, ...)` (because of reasons 1 and 2). It would be nice if we spoke the same language across packages.
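Reason 2 can be illustrated with a self-contained sketch: with the data last, a single `Vararg` method accepts any number of datasets. The types below are stand-ins and the "computation" merely counts datasets; the real Entropies.jl and CausalityTools methods differ:

```julia
# Hypothetical stand-in types, not the actual package definitions.
abstract type Entropy end
abstract type EntropyEstimator end
struct Shannon <: Entropy end
struct Kraskov <: EntropyEstimator end

# Data-last ordering lets one method take any number of input datasets.
function entropy(e::Entropy, est::EntropyEstimator, xs::AbstractVector...)
    # Stand-in body: a real implementation would estimate the joint entropy
    # of the datasets in `xs`; here we just count them.
    return length(xs)
end

entropy(Shannon(), Kraskov(), rand(10), rand(10), rand(10))  # three datasets
```

With the data in the middle, as in `entropy(e, x, est)`, such a varargs method cannot be expressed as a single signature.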
That said (and I completely forgot this, because I don't explicitly use it in CausalityTools at the moment), we should also change the order of the arguments for `probabilities`.
I don't see any downside of doing so, other than "it is a change". But we're releasing 2.0, so we might as well be consistent everywhere.
ok, are you changing the order here? There is a massive git conflict.
> ok, are you changing the order here?

Yes, I will change the order here.

> There is a massive git conflict.

Yes, I think that's because of the folder structure change. However, that can easily be fixed. I'll fix it as I do the remaining changes for `probabilities`.
@Datseris The refactoring, according to our discussion above, is now ready. All tests pass, and the documentation looks good.
The git diff looks massive due to the folder structure changes, but in reality there aren't any changes other than the ordering of method arguments. An exact overview can be found in my updated initial comment.
Work in progress. Addresses #167. I will update this when #162 is merged.
Changes
- Small bug fix for the `Zhu` and `ZhuSingh` estimators: the `base` belongs in the entropy struct, not in the estimator.
- `IndirectEntropy` is now `EntropyEstimator`. Documentation strings for subtypes of `EntropyEstimator` explicitly say what type of entropy they estimate. For example, the docstring for `Kraskov` says that it estimates Shannon entropy.
- Folder structures, both for source code and tests, are changed to reflect this conceptual change:
  - An entropy estimator `B` lives in folder `src/entropies/estimators/...`, and its tests live in `test/entropies/estimators/B.jl`.
  - A probabilities estimator `A` lives in folder `src/probabilities_estimators/A`, and its tests live in `test/probabilities/estimators/A.jl`.

Order of arguments

See the discussion above for the justification of the following changes.

- The entropy API is now:
  - `entropy(e::Entropy, est::EntropyEstimator, x)`
  - `entropy(e::Entropy, est::ProbabilitiesEstimator, x)`
  - `entropy(e::Entropy, probs::Probabilities)`
- `entropy_maximum` and `entropy_normalized` have been adjusted accordingly.
- The probabilities API is now `probabilities(est::ProbabilitiesEstimator, x)`.
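For the `entropy(e::Entropy, probs::Probabilities)` form, here is a hedged sketch of the call style using minimal stand-in types (the real `Shannon` and `Probabilities` types in Entropies.jl carry more structure than this):

```julia
# Minimal stand-in types to illustrate the probabilities-based call;
# not the actual Entropies.jl definitions.
struct Probabilities
    p::Vector{Float64}
end

struct Shannon
    base::Float64
end
Shannon(; base = 2) = Shannon(float(base))

# Shannon entropy directly from pre-computed probabilities:
# H = -sum(p * log_base(p)), skipping zero-probability bins.
entropy(e::Shannon, probs::Probabilities) =
    -sum(p * log(e.base, p) for p in probs.p if p > 0)

entropy(Shannon(base = 2), Probabilities([0.5, 0.5]))  # 1 bit for a fair coin
```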
TODO: