probml / pml2-book

Probabilistic Machine Learning: Advanced Topics
MIT License

Approximate nonlinear Gaussian BP (version Jul 29, 2022) #118

Closed e-pet closed 2 years ago

e-pet commented 2 years ago

Thanks for creating this amazing resource! :-) I have some minor suggestions regarding the characterization of two papers I was involved in.

You currently write (p. 371, section 3.8.4 on Inference for state-space models / inference based on statistical linearization):

In [HPR19] they extend IPLS [Iterated Posterior Linearization Smoother] to belief propagation in Forney factor graphs (Section 4.6.1.2), which enables the method to be applied to general graphical models with Gaussian potentials but nonlinear dependencies

and (p. 392, section 3.9.2. on Inference for graphical models / Loopy BP / Gaussian BP):

To perform message passing in models with non-Gaussian potentials, we can extend the techniques from Section 8.5.2 from chains to general graphs. For example, we can use local linearization, similar to extended Kalman filter (see [PHR18]); or we can use sigma point BP [MHH14], similar to unscented Kalman filter.

These characterizations are not entirely precise.

Hope this helps and does not come across as picky / vain. :-)

Thanks again for your tremendous efforts!
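As background for the "local linearization, similar to extended Kalman filter" idea quoted above, here is a minimal sketch (illustrative only; the function names and toy model are mine, not code from the book or the cited papers) of EKF-style Jacobian linearization: a nonlinear function g is replaced by its first-order Taylor expansion around the prior mean, so a Gaussian prior stays (approximately) Gaussian after the map.

```python
import numpy as np

def jacobian_linearize(g, jac_g, m, P):
    """EKF-style local linearization of y = g(x) around the prior mean m.

    Replaces g by its first-order Taylor expansion g(x) ~ A x + b with
    A = dg/dx evaluated at m, so a Gaussian prior N(m, P) maps to the
    approximate Gaussian N(g(m), A P A^T).
    """
    A = jac_g(m)              # Jacobian at the prior mean
    b = g(m) - A @ m          # affine offset of the linearization
    mean = A @ m + b          # equals g(m)
    cov = A @ P @ A.T
    return mean, cov

# toy example: elementwise nonlinearity g(x) = sin(x)
g = lambda x: np.sin(x)
jac_g = lambda x: np.diag(np.cos(x))
m, P = np.array([0.5]), np.array([[0.1]])
mean, cov = jacobian_linearize(g, jac_g, m, P)
```

The propagated mean is exactly g(m) and the covariance is the Jacobian sandwich A P A^T, which is the local approximation the EKF makes at every step.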

murphyk commented 2 years ago

ok, i rewrote the sec on p371 as follows

In \citep{Herzog2019} they extend IPLS to belief propagation
in Forney factor graphs (\cref{sec:FFG}),
which enables the method to be applied to a large class of graphical models
beyond \SSMs. In particular, they give a general linearization formulation
(including explicit message update rules) for nonlinear approximate Gaussian BP
(\cref{sec:gaussBP})
where the linearization can be Jacobian-based (``EKF-style''), statistical
(moment matching / quadrature filtering / sigma points), or anything else.
They also show how any such linearization method can benefit from iterations.
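The "statistical (moment matching / quadrature filtering / sigma points)" alternative mentioned in this passage can be sketched as follows (an illustrative toy, under standard unscented-transform conventions; names and parameters are mine, not from the papers): propagate sigma points through g, moment-match the output, and read off the mean-square-optimal affine fit A x + b together with the covariance Omega of the linearization error.

```python
import numpy as np

def statistical_linearize(g, m, P, kappa=1.0):
    """Statistical (sigma-point) linearization of y = g(x) w.r.t. N(m, P).

    Propagates unscented-transform sigma points through g, moment-matches
    the output, and recovers the affine fit A x + b plus the covariance
    Omega of the linearization error, as used in posterior linearization.
    """
    n = m.shape[0]
    S = np.linalg.cholesky((n + kappa) * P)
    # 2n + 1 sigma points with standard unscented-transform weights
    X = np.column_stack([m] + [m + S[:, i] for i in range(n)]
                            + [m - S[:, i] for i in range(n)])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    Y = np.column_stack([g(X[:, i]) for i in range(2 * n + 1)])
    mu = Y @ w                                # matched mean E[g(x)]
    dX, dY = X - m[:, None], Y - mu[:, None]
    Pxy = dX @ np.diag(w) @ dY.T              # input-output cross-covariance
    Pyy = dY @ np.diag(w) @ dY.T              # output covariance
    A = Pxy.T @ np.linalg.inv(P)              # slope of the affine fit
    b = mu - A @ m                            # offset of the affine fit
    Omega = Pyy - A @ P @ A.T                 # linearization-error covariance
    return A, b, Omega

A, b, Omega = statistical_linearize(np.sin, np.array([0.5]), np.array([[0.1]]))
```

Unlike the Jacobian version, the slope A here depends on the whole prior N(m, P), not just on the mean, and the extra term Omega accounts for how badly an affine function fits g under that prior.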

and on p392 as follows

To perform message passing in models with non-linear (but Gaussian) potentials,
we can generalize the extended Kalman filter
techniques from \cref{sec:EKF}
and the moment matching techniques (based on quadrature / sigma points)
from \cref{sec:GGF,sec:statlin}
from chains to general factor graphs
(see e.g., \citep{SigmaBP,Petersen2018,Herzog2019}).

is this correct?
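To make the "from chains to general factor graphs" part of the rewritten passage concrete, here is a minimal sketch (illustrative only; the toy model and helper names are mine, not from the book or the cited papers) of Gaussian BP in information form on a three-node chain, with a sanity check against dense inference. After linearization, every potential in the models discussed above reduces to this canonical Gaussian form.

```python
import numpy as np

# Gaussian BP in information (canonical) form on a 3-node chain x1 - x2 - x3.
# Joint: p(x) prop. to exp(-0.5 x^T J x + h^T x), tridiagonal precision J.
J = np.array([[2.0, -0.5, 0.0],
              [-0.5, 2.0, -0.5],
              [0.0, -0.5, 2.0]])
h = np.array([1.0, 0.0, -1.0])

def edge_msg(i, j, in_lam, in_eta):
    """Message i -> j: marginalize x_i out of its node potential combined
    with all incoming messages except the one coming from j."""
    lam_i = J[i, i] + in_lam
    eta_i = h[i] + in_eta
    return -J[j, i] * J[i, j] / lam_i, -J[j, i] * eta_i / lam_i

# leaf-to-root messages; on a tree, one sweep in each direction is exact
l12, e12 = edge_msg(0, 1, 0.0, 0.0)
l32, e32 = edge_msg(2, 1, 0.0, 0.0)

# marginal of x2: node potential plus both incoming messages
lam2 = J[1, 1] + l12 + l32
mean2 = (h[1] + e12 + e32) / lam2

# sanity checks against dense inference: J mean = h, var = inv(J) diagonal
assert np.isclose(mean2, np.linalg.solve(J, h)[1])
assert np.isclose(1.0 / lam2, np.linalg.inv(J)[1, 1])
```

On loopy graphs the same message updates can be iterated, which is the Gaussian BP setting the quoted section refers to; on this chain they terminate after one pass per direction.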

e-pet commented 2 years ago

Yes, that looks perfect! Thanks a lot. 😊
