jdwittenauer / ipython-notebooks

A collection of IPython notebooks covering various topics.

Linear regression Q1 #11

Closed Umartahir93 closed 6 years ago

Umartahir93 commented 6 years ago

```python
def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])
    cost = np.zeros(iters)

    for i in range(iters):
        error = (X * theta.T) - y
```

Why are we using the `for i in range(iters)` loop here? The update is a matrix multiplication, so I don't think we need a loop; it will always give the same answer. Can you please tell me what the benefit of the loop is?

Umartahir93 commented 6 years ago

I understand now: the loop is there so theta converges over the iterations. Oops, my bad. It was easy.
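
To make the resolution concrete: the matrix product `(X * theta.T) - y` is *not* the same on every pass, because `theta` is updated at the end of each iteration, so the error is recomputed against the new parameters. Below is a minimal runnable sketch of batch gradient descent in the style of the snippet above; the body of the parameter-update loop and the demo data are assumptions filled in for illustration, not necessarily the notebook's exact code.

```python
import numpy as np

def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])
    cost = np.zeros(iters)

    for i in range(iters):
        # error changes every iteration because theta was updated last pass
        error = (X * theta.T) - y
        for j in range(parameters):
            term = np.multiply(error, X[:, j])
            # simultaneous update: all j use the same error from this pass
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
        theta = temp
        # track squared-error cost so convergence is visible
        cost[i] = np.sum(np.power((X * theta.T) - y, 2)) / (2 * len(X))

    return theta, cost

# Hypothetical demo data: fit y ~ x with a bias column of ones.
X = np.matrix([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.matrix([[1.0], [2.0], [3.0]])
theta0 = np.matrix(np.zeros((1, 2)))

theta, cost = gradientDescent(X, y, theta0, 0.01, 1000)
```

Running this shows `cost` decreasing across iterations, which is exactly why the loop is needed: each pass moves `theta` a step downhill and the next pass starts from there.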