
DiscreteEntropy.jl


DiscreteEntropy is a Julia package for estimating the Shannon entropy of discrete random variables. It contains implementations of many popular entropy estimators, such as Chao-Shen, NSB, Miller-Madow, and various Bayesian estimators. It also provides functions for estimating cross entropy, KL divergence, mutual information, conditional information, Theil's U, and other entropy measures, and it supports jackknife and Bayesian bootstrap resampling for data-poor estimation.

For more information, see the documentation.

Quick Example

julia> using DiscreteEntropy
julia> data = [1,2,3,4,3,2,1];
julia> h = estimate_h(from_data(data, Histogram), ChaoShen)
2.0775715569320012
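
The same call pattern works for the other estimators: build the counts once with from_data and pass a different estimator type to estimate_h. As a minimal sketch, assuming the Miller-Madow and NSB estimators listed above are exported as MillerMadow and NSB:

julia> counts = from_data(data, Histogram);
julia> estimate_h(counts, MillerMadow)   # bias-corrected maximum likelihood estimate (assumed exported name)
julia> estimate_h(counts, NSB)           # Bayesian Nemenman-Shafee-Bialek estimate (assumed exported name)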

Contributing and Bugs

Please see CONTRIBUTING.md for details on how to contribute to the project through pull requests or issues.