airlab-unibas / airlab

Image registration laboratory for 2D and 3D image data
Apache License 2.0

Mutual Information loss H #25

Closed connorlee77 closed 4 years ago

connorlee77 commented 4 years ago

Can you be a bit more transparent about how you calculate the mutual information loss? Particularly H. How did you guys calculate the joint histogram?

ekhahniii commented 4 years ago

Hello, I too am curious about the formulation for calculating the joint histogram. Maybe you could comment a bit on the motivation for this line:

p_joint = th.mm(p_f, p_m.transpose(0, 1)).div(self._normalizer_2d)

@ChristophJud @RobinSandkuehler

connorlee77 commented 4 years ago

@ekhahniii The histograms are estimated via kernel density estimation. If you write out the bivariate Gaussian kernel, you'll notice it decomposes into the product of two univariate kernels (ignoring scaling terms). The matrix multiplication then performs both the per-pixel product of the two kernel responses and the summation over all pixels in a single step.
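For what it's worth, here is a minimal sketch of that idea. The function name, bin count, sigma, and normalization below are illustrative assumptions and not airlab's exact implementation:

```python
import torch as th

def parzen_joint_histogram(fixed, moving, bins=32, sigma=0.05):
    """Sketch of a Parzen-window (KDE) joint histogram.

    fixed, moving: 1-D tensors of intensities, assumed normalized to [0, 1].
    bins and sigma are illustrative defaults, not airlab's actual values.
    """
    centers = th.linspace(0, 1, bins)  # bin centers, shape (bins,)

    # Univariate Gaussian kernel response of every pixel to every bin,
    # shape (bins, num_pixels) for each image.
    p_f = th.exp(-(centers[:, None] - fixed[None, :]) ** 2 / (2 * sigma ** 2))
    p_m = th.exp(-(centers[:, None] - moving[None, :]) ** 2 / (2 * sigma ** 2))

    # The bivariate Gaussian factors into the two univariate kernels, so
    # summing their product over pixels is exactly a matrix multiplication:
    # p_joint[i, j] = sum_n p_f[i, n] * p_m[j, n]
    p_joint = th.mm(p_f, p_m.transpose(0, 1))

    # Normalize so the joint histogram sums to 1 (airlab uses a precomputed
    # normalizer for this; dividing by the total plays the same role here).
    return p_joint / p_joint.sum()
```

From the resulting `p_joint` you can get the marginals by summing over rows and columns, and then compute the mutual information in the usual way.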