paulbrodersen / entropy_estimators

Estimators for the entropy and other information theoretic quantities of continuous distributions
GNU General Public License v3.0

Does "partial mutual information" here mean "conditional mutual information"? #13

Opened by lupupu · closed 2 years ago

lupupu commented 2 years ago

https://github.com/paulbrodersen/entropy_estimators/blob/11fad3c048d681b5e2117288d051734b2e886b7b/entropy_estimators/continuous.py#L336

paulbrodersen commented 2 years ago

The two terms are often used interchangeably.

I give the mathematical definition two lines further down in the docstring: I(X,Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)

Basically, you are trying to compute the mutual information between processes X and Y while ignoring the contributions that Z makes to both X and Y.
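
For illustration, here is a minimal sketch of that identity in code. It assumes that `continuous.get_h` is the k-nearest-neighbour (Kozachenko-Leonenko) entropy estimator exposed by this module and that it accepts an `(n_samples, n_dims)` array plus a `k` keyword; the name of the wrapper at the linked line (assumed here to be `get_pmi`) and the exact signatures may differ, so please check the docstrings in `continuous.py`.

```python
import numpy as np
from entropy_estimators import continuous

# Toy data: Z drives both X and Y, plus a direct X -> Y coupling,
# so I(X,Y|Z) should remain clearly positive.
rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(size=n)
x = z + rng.normal(scale=0.5, size=n)
y = z + 0.5 * x + rng.normal(scale=0.5, size=n)

# Joint entropies via the k-NN entropy estimator.
# NOTE: get_h is assumed to take an (n_samples, n_dims) array and a k argument.
k = 5
h_xz = continuous.get_h(np.column_stack([x, z]), k=k)
h_yz = continuous.get_h(np.column_stack([y, z]), k=k)
h_xyz = continuous.get_h(np.column_stack([x, y, z]), k=k)
h_z = continuous.get_h(z[:, np.newaxis], k=k)

# Conditional (a.k.a. partial) mutual information:
# I(X,Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
cmi = h_xz + h_yz - h_xyz - h_z
print(f"I(X,Y|Z) ~ {cmi:.3f} nats")

# The convenience function discussed in this issue should give a comparable value,
# assuming it is called get_pmi and shares the same k convention:
# cmi_direct = continuous.get_pmi(x, y, z, k=k)
```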

Does that answer your question?

paulbrodersen commented 2 years ago

Closing the issue due to inactivity. Feel free to re-open if necessary.