Closed kahaaga closed 1 year ago
sure you can do this if you want. I don't care because this is a change that has no actual impact, nothing would change in neither our code nor future extensions.
Ok, I'll fix it after #227 is merged.
Since this doesn't change anything for the user of Entropies.jl, this doesn't need to go in 2.0. It is a non-breaking change, so can just go in 2.1. Let's not let more stuff hold the new release back.
@Datseris Sorry, one last thing that popped up as you work on #227. A thing that has been bugging me while working on upstream stuff is as follows:
Why isn't `Shannon` an `EntropyDefinition` in itself? There are multiple entropy definitions that, in some limit, approach the Shannon entropy. This is the case at least for `Tsallis`, `Kaniadakis` and `Renyi`, and probably many more. It seems completely arbitrary that `Shannon` should be an alias for `Renyi` with a certain parameter. This alias made sense when we only had the Shannon and Rényi entropies, but that is no longer the case. There will be many more entropies, many of which also converge to the Shannon entropy.
In practice it means I have to write a `Renyi`-to-`Shannon` wrapper for every upstream method that uses the Shannon entropy.
In information theory, Shannon is the entropy. The star of the show, so to speak. I think it should be its own type.