rlabbe / Kalman-and-Bayesian-Filters-in-Python

Kalman Filter book using Jupyter Notebook. Focuses on building intuition and experience, not formal proofs. Includes Kalman filters, extended Kalman filters, unscented Kalman filters, particle filters, and more. All exercises include solutions.

Computational Properties of Gaussians #297

Open dimidagd opened 5 years ago

dimidagd commented 5 years ago

A remarkable property of Gaussian distributions is that the sum of two independent Gaussians is another Gaussian.

This is true for Gaussian random variables, not for their distributions. The sum of two Gaussian densities is a mixture distribution.
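A quick way to see the difference (a sketch using NumPy/SciPy, not code from the book):

```python
import numpy as np
from scipy import stats

mu1, std1 = 0.0, 1.0
mu2, std2 = 5.0, 2.0

# Sum of two independent Gaussian *random variables*:
# Gaussian with mean mu1 + mu2 and variance std1**2 + std2**2.
rng = np.random.default_rng(1)
z = rng.normal(mu1, std1, 100_000) + rng.normal(mu2, std2, 100_000)
print(z.mean(), z.std())   # ~5.0 and ~sqrt(5) ~= 2.24

# Sum of the two *density functions* (rescaled to integrate to 1):
# a two-component mixture, bimodal here -- not a Gaussian.
xs = np.linspace(-5, 15, 1000)
mixture = 0.5 * (stats.norm(mu1, std1).pdf(xs) + stats.norm(mu2, std2).pdf(xs))
```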

Ryanglambert commented 5 years ago

Should say "product", not "sum".

Ryanglambert commented 5 years ago

The joint probability distribution (i.e., the product) of two Gaussians is Gaussian.
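For what it's worth, the identity behind this is that the pointwise product of two Gaussian density functions in the same variable is a scaled Gaussian; a quick numeric check (my own SciPy sketch, not code from the book):

```python
import numpy as np
from scipy import stats

mu1, var1 = 10.0, 4.0
mu2, var2 = 12.0, 1.0

xs = np.linspace(0.0, 20.0, 2001)
product = stats.norm(mu1, np.sqrt(var1)).pdf(xs) * stats.norm(mu2, np.sqrt(var2)).pdf(xs)

# The product equals S * N(x; mu, var) with the familiar update equations.
var = var1 * var2 / (var1 + var2)
mu = (mu1 * var2 + mu2 * var1) / (var1 + var2)
S = stats.norm(mu2, np.sqrt(var1 + var2)).pdf(mu1)   # scale factor, not 1 in general

assert np.allclose(product, S * stats.norm(mu, np.sqrt(var)).pdf(xs))
```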

dimidagd commented 5 years ago

It says so for the product as well, which is why I am trying to correct it. Readers might get confused.

zhenlin commented 5 years ago

This section appears to be using "sum" and "product" in two different ways.
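To make the two usages concrete (a sketch of my own, not from the book): "product" can mean the product of the random variables themselves, which is not Gaussian, or the pointwise product of their density functions, which has a Gaussian shape (the sense relevant to the filter update).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.normal(0.0, 1.0, 100_000)
y = rng.normal(0.0, 1.0, 100_000)

# Product of the *random variables*: sharply peaked and heavy-tailed,
# not Gaussian (excess kurtosis of a Gaussian is 0; here it is around 6).
print(stats.kurtosis(x * y))

# Pointwise product of the *density functions*: a Gaussian-shaped curve
# (its area is not 1, so it is a Gaussian function rather than a density).
xs = np.linspace(-6.0, 6.0, 1000)
prod_pdf = stats.norm(0, 1).pdf(xs) * stats.norm(1, 2).pdf(xs)
```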

rlabbe commented 4 years ago

I fixed this via #332, but am leaving this open until I consider all the cases zhenlin enumerated and check whether any further changes are required (they probably are).

risa2000 commented 3 years ago

The Computational Properties of Gaussians section starts by recalling the sum and the product of two random variables, which seems perfectly fine, but then follows with:

Before we do the math, let's test this visually.

and then shows the "element-wise multiplication" of two "Gaussians".

It seems that the example is trying to demonstrate that an analogue of the update operation from the previous chapter, where the posterior is calculated from the prior and the likelihood, preserves the "normality" of the distribution when working with discrete probability mass functions (PMFs), and can possibly be extended to continuous representations (PDFs). But this update operation is neither the product nor the sum defined at the beginning of this section.

What I also find confusing is that while in the graph the functions are drawn as continuous curves (and the explanation below the graph also suggests the use of Gaussian functions), in the calculation they are treated as discrete probability distributions (PMFs), sampled (and normalized) from the normal density function (PDF). So it is not simply PDF_1(x)*PDF_2(x), as intuition would suggest, and this also explains why the Y-axis values depend on the sampling rate.
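Roughly what I mean, as a sketch (my own code, not the book's): if the sampled curves are normalized so their values sum to one, the values scale with the grid spacing, and so does the peak of the element-wise product after renormalization.

```python
import numpy as np
from scipy import stats

def sampled_gaussian(mu, std, xs):
    """Sample a Gaussian PDF on a grid and normalize so the values sum to 1 (a PMF)."""
    p = stats.norm(mu, std).pdf(xs)
    return p / p.sum()

for n in (100, 1000):
    xs = np.linspace(0.0, 20.0, n)
    p1 = sampled_gaussian(10.0, 2.0, xs)
    p2 = sampled_gaussian(12.0, 1.0, xs)
    posterior = p1 * p2
    posterior /= posterior.sum()    # element-wise product, renormalized
    print(n, posterior.max())       # the peak value shrinks as the grid gets finer
```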

So, while technically correct, I find the placement of the graph misleading, because it does not "test visually" anything mentioned at the beginning; instead it introduces yet another interesting property of Gaussians.

EDIT: I realize I was not entirely fair in my previous comment, as the book actually states:

There we can say that the result of multiplying two Gaussian distributions is a Gaussian function (recall function in this context means that the property that the values sum to one is not guaranteed).

right at the beginning, which is what the following example tries to demonstrate.

So I guess my final comment would be that the wording could more clearly distinguish when we operate on random variables and when we operate on their probability functions, and possibly also add some motivation for why we might want to do one or the other on the way to our goal.
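For instance, a compact way to state the distinction (my own summary of standard identities, not a quote from the book):

$$X \sim \mathcal{N}(\mu_1, \sigma_1^2),\quad Y \sim \mathcal{N}(\mu_2, \sigma_2^2) \text{ independent} \;\Rightarrow\; X + Y \sim \mathcal{N}(\mu_1+\mu_2,\ \sigma_1^2+\sigma_2^2)$$

is an operation on random variables, whereas

$$\mathcal{N}(x;\mu_1,\sigma_1^2)\,\mathcal{N}(x;\mu_2,\sigma_2^2) = S\,\mathcal{N}\!\left(x;\ \frac{\mu_1\sigma_2^2+\mu_2\sigma_1^2}{\sigma_1^2+\sigma_2^2},\ \frac{\sigma_1^2\sigma_2^2}{\sigma_1^2+\sigma_2^2}\right),\qquad S = \mathcal{N}(\mu_1;\mu_2,\ \sigma_1^2+\sigma_2^2)$$

is an operation on their density functions; since $S$ is generally not 1, the result is a Gaussian function rather than a normalized density.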