[x] Spelling is inconsistent (I can see which parts were written by whom depending on spelling!) - pedantic, but easy to fix.
[x] P2 “which scales…” -> “which naively scales…”
[x] P4 “darfs” -> “dwarfs”
[x] P5 Examples - you commit to details (like commenting on using MCMC, the type of covariance function etc) all well ahead of the text which describes the details. It might leave a reader not familiar with GPs to get confused - just a thought.
[x] P8 Fig. 4 “thick thick” -> “thick”. Further, this line might represent a finite Markov linkage - it is only for e.g. the SE kernel that this is infinitely fully connected. Other kernels (like the Matern family) can be defined using a finite state-space representation, and hence with a finite Markov linkage.
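To make the point above concrete (a sketch for the authors, not from the manuscript; kernel hyperparameter values are arbitrary): the Matern-3/2 kernel k(τ) = σ²(1 + λτ)e^{−λτ} with λ = √3/ℓ is exactly reproduced by a two-dimensional linear SDE, i.e. a finite-state Markov representation, which the SE kernel does not admit.

```python
# Sketch: Matern-3/2 as a finite (2-state) state-space / Markov model.
# k(tau) = s2 * (1 + lam*tau) * exp(-lam*tau), with lam = sqrt(3)/ell.
import numpy as np
from scipy.linalg import expm

ell, s2 = 1.3, 2.0                  # illustrative length-scale and variance
lam = np.sqrt(3.0) / ell

# State-space form: dx = F x dt + L dw, observed process f = H x.
F = np.array([[0.0, 1.0],
              [-lam**2, -2.0 * lam]])
Pinf = np.diag([s2, lam**2 * s2])   # stationary state covariance
H = np.array([1.0, 0.0])

def k_state_space(tau):
    """Covariance via the Markov representation: H expm(F*tau) Pinf H^T."""
    return H @ expm(F * tau) @ Pinf @ H

def k_closed_form(tau):
    """Closed-form Matern-3/2 covariance."""
    return s2 * (1.0 + lam * tau) * np.exp(-lam * tau)

# The two agree for any lag tau >= 0:
for tau in [0.0, 0.5, 1.7]:
    assert np.isclose(k_state_space(tau), k_closed_form(tau))
```

This is why Matern-family GPs can be run as Kalman filters in O(n) time, while the SE kernel only admits approximate finite-state representations.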
[x] P10 “smooth” - you might want to add a footnote defining what you mean here (differentiability etc.).
[x] P12 White noise term addition - this is particularly important, yet gets little bandwidth - many readers will not quite understand how this works and how the white-noise kernel gets added.
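A minimal numpy sketch of the mechanics the comment above is asking the authors to spell out (illustrative only; the SE kernel and all hyperparameter values here are assumptions, not taken from the manuscript): the white-noise term σ_n²I is added only on the diagonal of the training covariance, while the test-train cross-covariance stays noise-free.

```python
# Sketch: where the white-noise kernel enters GP regression.
# The noise variance sn2 is added ONLY to the diagonal of the training
# covariance K(X, X); the cross-covariance K(X*, X) uses the noise-free kernel.
import numpy as np

def se_kernel(a, b, ell=1.0, s2=1.0):
    """Squared-exponential kernel matrix between 1-D input vectors a and b."""
    d = a[:, None] - b[None, :]
    return s2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 5.0, 20)                       # training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(X.size)   # noisy observations
Xs = np.linspace(0.0, 5.0, 50)                      # test inputs
sn2 = 0.1 ** 2                                      # noise variance

Ky = se_kernel(X, X) + sn2 * np.eye(X.size)  # white-noise kernel added here
Ks = se_kernel(Xs, X)                        # noise-free cross-covariance
mu = Ks @ np.linalg.solve(Ky, y)             # posterior mean at test inputs
```

Even a short version of this (one equation plus one sentence saying the noise sits only on the diagonal of the training block) would likely clear up the confusion.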
[x] P24 Uncertainties on input variables are ignored. You might consider commenting that this can be avoided - for example, the GPz work we did with Matt Jarvis uses sparse GPs with full input uncertainties over the observations.