made many small edits at the request of @mitpress, including adding hundreds of Oxford commas.
completely rewrote ch 25 on diffusion models (now includes SDEs).
improved ch 8 on Gaussian filtering/smoothing (pseudocode now matches our dynamax library :)
added short new section on hypothesis testing (sec 3.12)
moved HMM forwards-backwards algorithm into sec 9.2 (message passing on chains)
moved some stuff from the main text to the online supplement to meet the page limits.
Specifically, moved non-parametric Bayes (ch 31) and LVMs for graphs (sec 30.2) to online.
Moved sec 3.2 (Bayesian concept learning) back into main.
Moved sec 15.3.9 (logistic regression for Berkeley admissions) back into main.
added more details on basics of probability theory (sec 2.1)
tweaked presentation of particle filtering resampling algorithms (sec 13.2.4)
tweaked section on text generation with transformers (sec 22.4.1) to mention ChatGPT
fixed typos and other cosmetic things (e.g., changed some chapter titles)