Closed: schnirz closed this 5 years ago
Merging #8 into master will decrease coverage by 1.32%. The diff coverage is 65.34%.
```diff
@@            Coverage Diff             @@
##           master       #8      +/-   ##
==========================================
- Coverage   69.86%   68.54%   -1.33%
==========================================
  Files           5        6       +1
  Lines         292      391      +99
==========================================
+ Hits          204      268      +64
- Misses         88      123      +35
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/pc.jl | 71.42% <46.66%> (-4.09%) | :arrow_down: |
| src/klentropy.jl | 66.66% <66.66%> (ø) | |
| src/skeleton.jl | 68.42% <87.5%> (+2.42%) | :arrow_up: |
Continue to review the full report at Codecov.

> Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update a31ef1c...b881246.
I suppose I should have seen this coming... I can start working on unit tests for the entropy estimators to get these numbers back up.
Great! Some remarks:
```julia
function n_ball(n::Number)::Float64
    return π^(n/2.) / gamma(n/2. + 1.)
end
```
This makes

```julia
julia> ForwardDiff.derivative(n_ball, 10)
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(n_ball),Int64},Float64,1})
```

fail unnecessarily: the `::Float64` return annotation forces a conversion that `ForwardDiff.Dual` numbers cannot satisfy. I'd rather have
```julia
julia> function n_ball(n::Number)
           return π^(n/2.) / gamma(n/2. + 1.)
       end
n_ball (generic function with 1 method)

julia> ForwardDiff.derivative(n_ball, 10)
-0.7158154677840614
```
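Dropping the annotation also keeps `n_ball` easy to sanity-check against the known closed-form volumes of the unit n-ball. A minimal check, where `halfint_gamma` is a hypothetical stand-in for `SpecialFunctions.gamma` (valid only at integer and half-integer arguments) so the snippet runs in base Julia:

```julia
# Stand-in for SpecialFunctions.gamma, valid at integer/half-integer arguments:
# Γ(1/2) = √π, Γ(1) = 1, Γ(x) = (x - 1)·Γ(x - 1).
function halfint_gamma(x)
    x == 0.5 && return sqrt(pi)
    x == 1.0 && return 1.0
    return (x - 1) * halfint_gamma(x - 1)
end

# Volume of the unit n-ball, without the ::Float64 return annotation.
n_ball(n::Number) = pi^(n / 2) / halfint_gamma(n / 2 + 1)

# Known closed-form volumes:
@assert n_ball(1) ≈ 2.0       # the interval [-1, 1]
@assert n_ball(2) ≈ pi        # the unit disk
@assert n_ball(3) ≈ 4pi / 3   # the unit ball in 3D
```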
All code for the mutual information estimators is contained in a new file, `klentropy.jl`.
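As a starting point for the unit tests mentioned above, one way to test a nearest-neighbour entropy estimator is against a distribution with known differential entropy. The sketch below assumes `klentropy.jl` implements a Kozachenko–Leonenko style estimator; `kl_entropy` and its exact formula are illustrative, not the PR's actual API, and the `log(N - 1) + γ` term is the usual digamma-free approximation so the snippet stays in base Julia:

```julia
using Random

# Euler–Mascheroni constant and the volume of the unit 1-ball (the interval [-1, 1]).
const γ = Base.MathConstants.eulergamma
const V1 = 2.0

# 1-D Kozachenko–Leonenko style estimate:
# Ĥ ≈ (1/N) Σᵢ log εᵢ + log V₁ + log(N - 1) + γ,
# where εᵢ is the distance from sample i to its nearest neighbour
# (brute-force O(N²) search, fine for a test).
function kl_entropy(x::Vector{Float64})
    N = length(x)
    s = 0.0
    for i in 1:N
        ε = minimum(abs(x[i] - x[j]) for j in 1:N if j != i)
        s += log(ε)
    end
    return s / N + log(V1) + log(N - 1) + γ
end

Random.seed!(0)
x = rand(2_000)     # uniform on [0, 1]: true differential entropy is 0
h = kl_entropy(x)
println(h)          # should be close to 0
```

A test would then assert that `h` lands within a tolerance of the true value, e.g. `abs(h) < 0.3`; the same pattern works for a Gaussian sample against `0.5 * log(2π * e * σ^2)`.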