Leap 2.0 is a platform offered by Complete Open Source Solutions (COSS), where students from different campuses, regardless of background, can explore the limitless IT world by gaining on-demand knowledge of cutting-edge open-source technologies and get the opportunity to work on live projects.
Stochastic Gradient Descent for linear regression #15
The goal is to implement stochastic gradient descent (SGD) to optimize a linear regression model on the Boston House Prices dataset. scikit-learn already provides an SGD-based regressor, sklearn.linear_model.SGDRegressor; here, the SGD algorithm is implemented manually and its results are compared with the library version. Linear regression is a technique for predicting real-valued targets.
Stochastic gradient descent evaluates and updates the model coefficients at every iteration to minimize the error of the model on the training data.
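The per-iteration coefficient update described above can be sketched in plain NumPy. This is a minimal illustration, not the project's actual code: the function name `sgd_linear_regression`, the toy data, and the hyperparameter values are all assumptions chosen for clarity.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.05, epochs=100, seed=0):
    """Fit y ~ X @ w + b by SGD, updating coefficients one sample at a time."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)       # coefficients
    b = 0.0               # intercept
    for _ in range(epochs):
        for i in rng.permutation(n):      # visit samples in random order
            err = X[i] @ w + b - y[i]     # gradient of 0.5 * err**2 w.r.t. prediction
            w -= lr * err * X[i]          # coefficient update
            b -= lr * err                 # intercept update
    return w, b

# toy data: y = 3x + 2 plus a little noise (hypothetical, not the Boston dataset)
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.05, size=200)
w, b = sgd_linear_regression(X, y)
```

On this toy problem the recovered `w` and `b` should land close to the true values 3 and 2, since each noisy per-sample gradient step nudges the coefficients toward the least-squares solution.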
Objective:
Implement stochastic gradient descent for linear regression on the Boston House Prices dataset.
Implement SGD and deploy it on the Boston House Prices dataset.
Compare the results with sklearn.linear_model.SGDRegressor.
Compare the results when the learning rate is constant.
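The library-side baseline for the comparison above might look like the sketch below. Note that `load_boston` was removed from scikit-learn in version 1.2, so this sketch uses a synthetic stand-in dataset; the dataset, the true coefficients, and the hyperparameters (`eta0`, `max_iter`) are assumptions, not values from the project.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.metrics import mean_squared_error

# Synthetic stand-in data (load_boston was removed in scikit-learn 1.2).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 4.0 + rng.normal(0, 0.1, size=500)

# Constant learning rate, matching the objective's comparison case.
model = SGDRegressor(learning_rate="constant", eta0=0.01,
                     max_iter=1000, tol=None, random_state=0)
model.fit(X, y)
mse = mean_squared_error(y, model.predict(X))
```

With a constant learning rate SGD does not fully settle at the optimum but oscillates around it, which is exactly the behavior the objective asks to compare against the default decaying-rate schedule.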