The docs are very thorough! One thing that could be added to help improve model checking in general is a posterior predictive check against some of the data, to give an impression of over- or under-prediction of the data given the model and the likelihood.
Here one could simulate future data from the conditioned model, given the variability in the data, to see whether the posterior is, e.g., inflating the uncertainty in the data.
This is a stylistic choice, but it has a lot of opportunity to teach the community, as this is a widely used tool.
For example:
https://gallery.exoplanet.codes/en/latest/tutorials/tess/
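To make the suggestion concrete, here is a minimal, self-contained sketch of such a check using only NumPy. The observed data and the "posterior" draws below are synthetic stand-ins (not produced by any real conditioned model); in practice the draws would come from the fitted model's trace:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed data (a stand-in for, e.g., residuals of a light curve)
y_obs = rng.normal(1.0, 0.5, size=200)

# Hypothetical posterior draws for a normal model's mean and scale.
# In a real workflow these would come from the conditioned model's trace.
n_draws = 1000
mu_post = rng.normal(y_obs.mean(), y_obs.std() / np.sqrt(len(y_obs)), size=n_draws)
sigma_post = np.abs(rng.normal(y_obs.std(), 0.05, size=n_draws))

# Posterior predictive: simulate one replicated dataset per posterior draw
y_rep = rng.normal(mu_post[:, None], sigma_post[:, None],
                   size=(n_draws, len(y_obs)))

# Compare a test statistic (here the standard deviation) between the observed
# data and the replicated datasets.  A posterior predictive p-value near 0 or 1
# signals that the model is under- or over-dispersed relative to the data.
t_obs = y_obs.std()
t_rep = y_rep.std(axis=1)
ppp = (t_rep >= t_obs).mean()
print(f"posterior predictive p-value for std: {ppp:.2f}")
```

Overlaying a few rows of `y_rep` on the observed data (or plotting the histogram of `t_rep` against `t_obs`) is the visual version of this check that would fit naturally in the docs.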