I found that for some continuous variables, the entropy_estimators library returns negative numbers. Here is the reply I got from the author of the library:
For continuous variables, this package is calculating the differential entropy. Unfortunately, the differential entropy can be negative, making interpretation more difficult than in the discrete case. See chapter 8 of Cover and Thomas, for example, for a discussion of how to interpret negative differential entropies. (Consider, for instance, the differential entropy for a Gaussian which is proportional to log variance. If the variance is small, you get a negative number.)
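The author's Gaussian example can be checked directly from the closed-form formula h = 0.5 * ln(2*pi*e*sigma^2). A minimal sketch (the function name is mine, not part of entropy_estimators) showing that the differential entropy flips sign once the standard deviation gets small enough:

```python
import math

def gaussian_differential_entropy(sigma):
    """Closed-form differential entropy of N(mu, sigma^2):
    h = 0.5 * ln(2 * pi * e * sigma^2), in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

# Wide Gaussian: entropy is positive.
print(gaussian_differential_entropy(1.0))   # ~1.42 nats

# Narrow Gaussian: entropy is negative.
print(gaussian_differential_entropy(0.05))  # ~-1.58 nats
```

So a negative value from an estimator on a tightly concentrated continuous variable is expected behavior, not an estimation bug.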
My question is: for the information-theoretic feature selection methods that use this library for entropy calculation, will the selection result still be valid if the entropy estimate is negative?
Thanks