Sandy4321 opened this issue 3 years ago (status: Open)
The same pattern appears at https://www.geeksforgeeks.org/implementation-of-lasso-regression-from-scratch-using-python/ :

```python
db = -2 * np.sum(self.Y - Y_pred) / self.m
# update the intercept
self.b = self.b - self.learning_rate * db
```
And here https://github.com/llSourcell/linear_regression_live/blob/master/demo.py it is the same: `b_gradient += -(2/N) * (y - ((m_current * x) + b_current))`. At first I thought the `+ b_current` was a mistake, but sorry, `(y - ((m_current * x) + b_current))` is in fact correct.
In any case, in your code the derivative is `derivative = 2 * np.dot(errors, feature)` (from https://github.com/wiqaaas/youtube/blob/master/Machine_Learning_from_Scratch/Ridge_Regression/Ridge_Regression_using_Gradient_Descent.ipynb), i.e. the errors are multiplied by the feature values, while the other implementations above do not multiply by the data. Could you clarify why?
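For reference, here is how I understand the two formulas relate: the gradient of the mean squared error with respect to a weight w_j is -(2/m) * sum((y - y_pred) * x_j), and for the intercept the "feature" x_j is the constant 1, so the multiplication by the data simply drops out. A minimal sketch (variable names are my own, not from any of the linked repos):

```python
import numpy as np

# Hypothetical data for illustration: m samples, one feature.
rng = np.random.default_rng(0)
m = 100
x = rng.normal(size=m)
y = 3.0 * x + 1.5 + rng.normal(scale=0.1, size=m)

w, b = 0.0, 0.0                  # weight and intercept
y_pred = w * x + b
errors = y_pred - y              # prediction error per sample

# Gradients of MSE = (1/m) * sum((y_pred - y)**2):
dw = 2 * np.dot(errors, x) / m   # weight gradient: errors times the feature
db = 2 * np.sum(errors) / m      # intercept gradient: the intercept's
                                 # "feature" is the constant 1, so the
                                 # multiplication by x disappears

# Equivalently, append a column of ones and treat b as just another weight:
X = np.column_stack([x, np.ones(m)])
theta = np.array([w, b])
grad = 2 * X.T @ (X @ theta - y) / m
assert np.allclose(grad, [dw, db])
```

So the separate `db` update and the `np.dot(errors, feature)` form are the same rule; the ones column makes the intercept's feature factor invisible.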
Could you also clarify why the intercept is not computed separately in https://github.com/wiqaaas/youtube/blob/master/Machine_Learning_from_Scratch/Lasso_Regression/Lasso_Regression_using_Coordinate_Descent.ipynb, when it is computed separately in https://github.com/wiqaaas/youtube/blob/master/Machine_Learning_from_Scratch/Ridge_Regression/Ridge_Regression_using_Gradient_Descent.ipynb?
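My current understanding of the coordinate-descent part, in case it helps: a lasso implementation can skip a separate intercept step by absorbing the intercept as a column of ones that is simply exempted from soft-thresholding (the L1 penalty should not shrink the intercept). A minimal sketch of that idea, with my own data and variable names, not the notebook's:

```python
import numpy as np

def soft_threshold(rho, lam):
    # proximal operator for the L1 penalty
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

# Hypothetical data; a bias column of ones is prepended, so the
# intercept is just coordinate 0 of theta.
rng = np.random.default_rng(1)
m, n = 50, 3
X = np.column_stack([np.ones(m), rng.normal(size=(m, n))])
y = 2.0 + X[:, 1] - 0.5 * X[:, 3] + rng.normal(scale=0.1, size=m)

lam = 0.1
theta = np.zeros(n + 1)
for _ in range(200):                     # full coordinate sweeps
    for j in range(n + 1):
        X_j = X[:, j]
        # partial residual: add coordinate j's own contribution back in
        r = y - X @ theta + theta[j] * X_j
        rho = X_j @ r
        z = X_j @ X_j
        if j == 0:
            theta[j] = rho / z           # intercept: no soft-thresholding
        else:
            theta[j] = soft_threshold(rho, lam) / z
```

With this layout the intercept update is just an ordinary coordinate update with the penalty turned off, so no separate intercept formula is needed. (Another common reason to skip it entirely is centering X and y first, which makes the optimal intercept zero.)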