asadoughi / stat-learning

Notes and exercise attempts for "An Introduction to Statistical Learning"
http://asadoughi.github.io/stat-learning

p2 ch7 #57


sijunhe commented 9 years ago

Hey, I think your answer for p2 ch7 is wrong. As λ → ∞ the penalty term dominates, so:

(a) We are minimizing the area under the curve of g(x)^2, so g(x) is just 0.

(b) We are minimizing the area under the curve of g'(x)^2, so g'(x) = 0 and g(x) = k for some constant k (the k that minimizes the RSS is the mean of the y_i).

(c) We are minimizing the area under the curve of g''(x)^2, so g''(x) = 0 and g(x) = ax + b, a straight line fit to the points.

(d) g'''(x) = 0, so g(x) = ax^2 + bx + c, a quadratic curve fit to the points.

(e) The penalty term is dropped (λ = 0), so g(x) is an interpolant that goes through all the points, making RSS = 0.
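A quick numerical sanity check (my own sketch, not from the book or this repo): if we represent g by its values at equally spaced points and replace the integral penalty with a squared finite-difference penalty, the penalized problem has a closed-form ridge-type solution, and making λ huge shows (a)–(c) directly. Names like `penalized_fit` and `big` are just illustrative.

```python
import numpy as np

def penalized_fit(y, m, lam):
    """Solve min_g ||y - g||^2 + lam * ||D_m g||^2 in closed form."""
    n = len(y)
    D = np.eye(n)
    for _ in range(m):                 # build the m-th order finite-difference matrix
        D = np.diff(D, axis=0)
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)              # equally spaced, so "linear in index" = linear in x
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)

big = 1e8
g0 = penalized_fit(y, 0, big)          # (a): collapses to the zero function
g1 = penalized_fit(y, 1, big)          # (b): collapses to the constant mean(y)
g2 = penalized_fit(y, 2, big)          # (c): collapses to the least squares line

line = np.polynomial.Polynomial.fit(x, y, 1)
print(np.abs(g0).max())                # ~ 0
print(np.abs(g1 - y.mean()).max())     # ~ 0
print(np.abs(g2 - line(x)).max())      # small, shrinking as lam grows
```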

bidaning commented 8 years ago

agreed

leowang396 commented 4 years ago

Agreed.

To add on: the RSS term still has to be minimized subject to those constraints, so the answer to (c) would be the least squares linear regression fit, and the answer to (d) is the least squares quadratic fit.
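For (c) specifically, this can also be checked against an actual smoothing spline (penalty on g''). The sketch below assumes SciPy ≥ 1.10, where `scipy.interpolate.make_smoothing_spline` accepts the penalty weight as `lam`; treat it as an illustration rather than the book's code.

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 40)
y = np.cos(3 * x) + rng.normal(scale=0.1, size=x.size)

spline = make_smoothing_spline(x, y, lam=1e6)   # very large lambda
line = np.polynomial.Polynomial.fit(x, y, 1)    # least squares straight line

# With lambda this large the smoothing spline should essentially coincide
# with the least squares line, matching (c).
print(np.abs(spline(x) - line(x)).max())
```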