Open tienthach opened 3 years ago
In regression, the inputs (x) are fixed, i.e., we are only allowed to modify the corresponding y-values. In PCA, by contrast, we were looking at the problem of projecting a two-dimensional dataset onto a one-dimensional subspace. If we applied the same kind of orthogonal projection in Figure 9.12, the x-values would also change, but that is not allowed.
Does this help?
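The distinction can be sketched numerically. Below is a minimal example with made-up data: a least-squares fit changes only the y-coordinates of the points, while a PCA-style orthogonal projection onto the same line moves the x-coordinates as well.

```python
import numpy as np

# Hypothetical 2-D data: x-values are fixed design points, y-values are noisy.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 8)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

# Least-squares fit y ≈ a*x + b.
a, b = np.polyfit(x, y, deg=1)

# Regression "projection": only the y-coordinate changes; x stays fixed.
y_fit = a * x + b

# Orthogonal (PCA-style) projection of each point onto the same line:
# both coordinates generally change.
d = np.array([1.0, a]) / np.sqrt(1.0 + a**2)  # unit direction of the line
p0 = np.array([0.0, b])                       # a point on the line
pts = np.stack([x, y], axis=1)
proj = p0 + ((pts - p0) @ d)[:, None] * d

# The orthogonally projected points have different x-values in general.
print(np.allclose(proj[:, 0], x))
```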
Describe the mistake
In Figure 9.12b, the orange dots do not appear to be the orthogonal projections of the noisy observations (blue dots) onto the line. This is not like the situation in PCA; there is a difference between Figure 9.12b and Figure 3.9 on page 82. I understood Fig. 9.12b to show an orthogonal projection because, in the last paragraph of page 95 of chapter 3, you wrote: "This also has applications in machine learning, e.g., in linear regression where we want to find a (linear) function that minimizes the residual errors, i.e., the lengths of the orthogonal projections of the data onto the linear function (Bishop, 2006)."
The MLE solution of linear regression orthogonally projects the label vector y onto the subspace spanned by the columns of the data matrix X.
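This projection can be verified directly (a small sketch with hypothetical data): the fitted values from least squares coincide with applying the orthogonal projection ("hat") matrix H = X(XᵀX)⁻¹Xᵀ to y, and the residual is orthogonal to the column space of X.

```python
import numpy as np

# Hypothetical design matrix [1, x] and noisy labels.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(10), rng.normal(size=10)])
y = X @ np.array([0.5, 2.0]) + rng.normal(scale=0.1, size=10)

# Least-squares (MLE under Gaussian noise) solution and fitted values.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# The same fitted values via the orthogonal projection ("hat") matrix.
H = X @ np.linalg.inv(X.T @ X) @ X.T
print(np.allclose(H @ y, y_hat))          # fitted values == projection of y
print(np.allclose(X.T @ (y - y_hat), 0))  # residual ⟂ column space of X
```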