Yeah, this would be great -- could you whip up some Asymptote illustrating it? You could draw the level sets of w_1 + w_2 = c until they hit the L1 or L2 norm ball. While we're at it, we could also take x_1 = 2x_2, giving level sets 2w_1 + w_2 = c, to show that Lasso would select x_1 while ridge would mix the two. And this also answers my question about what that Quora illustration is showing.
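To make the idea concrete, here's a quick mock-up in Python/matplotlib rather than Asymptote (the radius r and the constants c below are placeholder values, not the ones we'd use in the real figure). The point it shows: the first contour line 2w_1 + w_2 = c to touch the L1 diamond does so at the vertex on the w_1 axis (so w_2 = 0 and Lasso selects x_1), while on the L2 circle the tangent point has both coordinates nonzero.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder values, chosen just to make the picture legible.
r = 1.0                         # norm-ball radius
cs = np.arange(-3.0, 3.5, 1.0)  # level-set constants c

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
w1 = np.linspace(-2, 2, 400)

for ax, ball in zip(axes, ("L1", "L2")):
    if ball == "L1":
        # Diamond |w_1| + |w_2| = r
        ax.plot([r, 0, -r, 0, r], [0, r, 0, -r, 0], color="tab:blue")
    else:
        # Circle w_1^2 + w_2^2 = r^2
        theta = np.linspace(0, 2 * np.pi, 200)
        ax.plot(r * np.cos(theta), r * np.sin(theta), color="tab:blue")
    # Level sets 2 w_1 + w_2 = c, i.e. w_2 = c - 2 w_1
    for c in cs:
        ax.plot(w1, c - 2 * w1, color="gray", linewidth=0.5)
    ax.set_xlim(-2, 2)
    ax.set_ylim(-2, 2)
    ax.set_aspect("equal")
    ax.set_xlabel("$w_1$")
    ax.set_ylabel("$w_2$")
    ax.set_title(f"{ball} ball, level sets $2w_1 + w_2 = c$")

plt.tight_layout()
plt.show()
```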
Here are drawings with no labels. Let me know if you want anything else on them (labels, other markup). https://github.com/davidrosenberg/mlcourse-homework/tree/master/in-prep/recitations/L1vL2
These are great. Let's change the axis labels to w_1 and w_2, give the equations for a few of the lines (as you did for the SVM), and also give the expressions for the norm balls (so we know the radii). Thanks!
Do we want the pictures all at the same scale? At the moment I just picked convenient radii for the norm-balls so the contour lines intersect them, but I could fix the radii and change the contours accordingly.
Not really sure. I guess I'd have to see it and think about how I'd present it...
In the lab you give a picture proof that L2 regularization sets the coefficients equal when you have a duplicated feature. I think it makes sense to continue that picture discussion and see what happens when you use L1 regularization instead.
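For reference, here is the algebra I take the pictures to be illustrating (my own sketch, not taken from the lab). With a duplicated feature the data-fit term depends only on the sum s = w_1 + w_2, so each penalty gets minimized along the line w_1 + w_2 = s:

```latex
% L2: strictly convex along the line, so the equal split is the unique minimizer
\min_{w_1 + w_2 = s} \left( w_1^2 + w_2^2 \right) = \frac{s^2}{2},
\qquad \text{attained only at } w_1 = w_2 = \tfrac{s}{2}.

% L1: constant along the segment, so every same-sign split ties
\min_{w_1 + w_2 = s} \left( |w_1| + |w_2| \right) = |s|,
\qquad \text{attained at every } (w_1, w_2) \text{ with } w_1 w_2 \ge 0
\text{ and } w_1 + w_2 = s.
```

So ridge has a unique minimizer (the equal split), while Lasso is indifferent among all splits of the same sign, which is exactly the flat face of the L1 ball in the drawings.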