mml-book / mml-book.github.io

Companion webpage to the book "Mathematics For Machine Learning"

Possible mistake in Figure 9.12b, chapter 9.4 #594

Open tienthach opened 3 years ago

tienthach commented 3 years ago

**Describe the mistake**
In Figure 9.12b, the orange dots do not appear to be the orthogonal projections of the noisy observations (blue dots) onto the line, unlike the situation in PCA. There is a discrepancy between Figure 9.12b and Figure 3.9 on page 82. I expected Fig. 9.12b to show an orthogonal projection because, in the last paragraph of page 95 of Chapter 3, you write: "This also has applications in machine learning, e.g., in linear regression where we want to find a (linear) function that minimizes the residual errors, i.e., the lengths of the orthogonal projections of the data onto the linear function (Bishop, 2006)."

The MLE solution of linear regression orthogonally projects the label vector onto the subspace spanned by the columns of the data matrix X.
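For reference, here is a minimal NumPy sketch (toy data, not from the book) that checks this identity numerically: the least-squares fit `X @ w` equals `H @ y`, where `H = X (XᵀX)⁻¹ Xᵀ` is the orthogonal projection onto the column space of `X`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy design matrix (bias column + one feature) and noisy targets
X = np.column_stack([np.ones(20), rng.normal(size=20)])
y = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.5, size=20)

# Maximum-likelihood / least-squares weights and fitted values
w, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ w

# Hat matrix: orthogonal projection onto the column space of X
H = X @ np.linalg.inv(X.T @ X) @ X.T

# The fitted values equal the projection of y onto span(X)
assert np.allclose(y_hat, H @ y)
```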

**Location**

  1. Version (bottom of page): the most up-to-date version of the book (downloaded 21/10/2020)
  2. Chapter: 9.4
  3. Page: 313


mpd37 commented 3 years ago

In regression, the inputs (x) are fixed; we are only allowed to modify the corresponding y-values. In PCA, we were looking at the problem of projecting a two-dimensional dataset onto a one-dimensional subspace. If we applied the same kind of orthogonal projection in Figure 9.12, the x-values would also change, but that is not allowed in the regression setting.
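To make the difference concrete, here is a minimal NumPy sketch (made-up points, a hypothetical line y = w·x through the origin) contrasting the two projections: the regression-style projection keeps each x fixed and only moves y onto the line, while a PCA-style orthogonal projection moves both coordinates.

```python
import numpy as np

# Made-up 2D points and a line through the origin with slope w
pts = np.array([[1.0, 2.0], [2.0, 1.5], [3.0, 4.0]])
w = 1.0
d = np.array([1.0, w]) / np.sqrt(1 + w**2)  # unit direction vector of the line

# Regression-style projection: x stays fixed, only y moves onto the line
reg_proj = np.column_stack([pts[:, 0], w * pts[:, 0]])

# PCA-style orthogonal projection: both coordinates can move
pca_proj = (pts @ d)[:, None] * d

print(reg_proj)  # first column (x-values) unchanged
print(pca_proj)  # x-values change as well
```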

Does this help?