hntee opened 5 years ago
In `ridge_regression.py`, the objective is written as `w* = argmin |t - X @ w| + alpha * |w|_2^2`. But according to https://en.wikipedia.org/wiki/Tikhonov_regularization, the residual term should be squared (second order), i.e. `|t - X @ w|_2^2`.
Did I miss something?
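For reference, with the squared residual the objective has the standard closed-form solution `w = (X^T X + alpha * I)^{-1} X^T t`. A minimal sketch (the helper name `ridge_fit` is hypothetical, not from the repo):

```python
import numpy as np

def ridge_fit(X, t, alpha):
    """Minimize ||t - X @ w||_2^2 + alpha * ||w||_2^2 (squared residual).

    Closed form: w = (X^T X + alpha * I)^{-1} X^T t.
    """
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ t)
```

At the returned `w`, the gradient `-2 X^T (t - X w) + 2 alpha w` vanishes, and with `alpha = 0` the solution reduces to ordinary least squares; neither property holds for the un-squared residual written in the docstring.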