stoianchoo opened this issue 7 years ago
The code inside the "for" loop is missing. Here is the full function:
```python
import numpy as np
import matplotlib.pyplot as plt

def svm_sgd_plot(X, Y):
    # Initialize our SVM's weight vector with zeros (3 values)
    w = np.zeros(len(X[0]))
    # The learning rate
    eta = 1
    # How many iterations to train for
    epochs = 100000
    # Store misclassifications so we can plot how they change over time
    errors = []

    # Training part: stochastic gradient descent
    for epoch in range(1, epochs):
        error = 0
        for i, x in enumerate(X):
            if (Y[i] * np.dot(X[i], w)) < 1:
                # Misclassified sample: update weights using the hinge-loss gradient
                w = w + eta * ((X[i] * Y[i]) + (-2 * (1/epoch) * w))
                error = 1
            else:
                # Correctly classified: apply only the regularization update
                w = w + eta * (-2 * (1/epoch) * w)
        errors.append(error)

    # Let's plot the rate of classification errors during training for our SVM
    plt.plot(errors, '|')
    plt.ylim(0.5, 1.5)
    plt.axes().set_yticklabels([])
    plt.xlabel('Epoch')
    plt.ylabel('Misclassified')
    plt.show()

    return w
```
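For anyone following along, the update the loop applies is, as I read the code, the usual SGD step for the regularized hinge loss, with a regularization term that decays as 1/epoch:

$$
w \leftarrow
\begin{cases}
w + \eta\left(y_i\, x_i - \frac{2}{\text{epoch}}\, w\right) & \text{if } y_i\,(x_i \cdot w) < 1,\\
w - \eta\,\frac{2}{\text{epoch}}\, w & \text{otherwise.}
\end{cases}
$$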
Also add `w = svm_sgd_plot(X, y)` in cell [9] before you plot the graph:
```python
for d, sample in enumerate(X):
    # Plot the negative samples
    if d < 2:
        plt.scatter(sample[0], sample[1], s=120, marker='_', linewidths=2)
    # Plot the positive samples
    else:
        plt.scatter(sample[0], sample[1], s=120, marker='+', linewidths=2)

w = svm_sgd_plot(X, y)

# Add our test samples
plt.scatter(2, 2, s=120, marker='_', linewidths=2, color='yellow')
plt.scatter(4, 3, s=120, marker='+', linewidths=2, color='blue')

# Print the hyperplane calculated by svm_sgd()
x2 = [w[0], w[1], -w[1], w[0]]
x3 = [w[0], w[1], w[1], -w[0]]
x2x3 = np.array([x2, x3])
X, Y, U, V = zip(*x2x3)
ax = plt.gca()
ax.quiver(X, Y, U, V, scale=1, color='blue')
```
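For reference, both of these cells assume that `X` and `y` were defined in an earlier cell of the notebook. If I remember the tutorial correctly, the data looks something like the following (treat the exact values as an assumption); each sample carries a constant third component that acts as the bias input, which is why `len(X[0])` is 3 inside `svm_sgd_plot`:

```python
import numpy as np

# Five 2-D samples with a constant -1 appended as the bias input (3 values per sample)
X = np.array([
    [-2, 4, -1],
    [4, 1, -1],
    [1, 6, -1],
    [2, 4, -1],
    [6, 2, -1],
])

# Labels: the first two samples are the negative class, the last three are positive
y = np.array([-1, -1, 1, 1, 1])
```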
Thanks @ajclaros, that fixes some of the problems. However the last section still doesn't quite work for me.
```
w = svm_sgd_plot(X,y)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-15-187fcd47c636> in <module>()
----> 1 w = svm_sgd_plot(X,y)
      2 #they decrease over time! Our SVM is learning the optimal hyperplane

<ipython-input-13-316b8b5d9d92> in svm_sgd_plot(X, Y)
      3 def svm_sgd_plot(X, Y):
      4     #Initialize our SVMs weight vector with zeros (3 values)
----> 5     w = np.zeros(len(X[0]))
      6     #The learning rate
      7     eta = 1

TypeError: object of type 'numpy.float64' has no len()
```
Any ideas? Is this even relevant? I'm still trying to wrap my head around some of the code...
Cheers
@sebkouba can you reproduce the problem when running from the terminal instead of IPython? i.e. copy and paste all the code into a new your-chosen-name-here.py and run it from the command prompt (on Windows) or the terminal (on macOS).
@sebkouba try putting `w = svm_sgd_plot(X, y)` at the top, right after the end of the function:

```python
w = svm_sgd_plot(X, y)

for d, sample in enumerate(X):
    # Plot the negative samples (the first 2)
    if d < 2:
        plt.scatter(sample[0], sample[1], s=120, marker='_', linewidths=2)
    # Plot the positive samples (the last 3)
    else:
        plt.scatter(sample[0], sample[1], s=120, marker='+', linewidths=2)

plt.scatter(2, 2, s=120, marker='_', linewidths=2, color='yellow')
plt.scatter(4, 3, s=120, marker='+', linewidths=2, color='blue')

x2 = [w[0], w[1], -w[1], w[0]]
print(x2)
x3 = [w[0], w[1], w[1], -w[0]]
print(x3)

x2x3 = np.array([x2, x3])
X, Y, U, V = zip(*x2x3)
ax = plt.gca()
ax.quiver(X, Y, U, V, scale=1, color='red')
```
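For what it's worth, I think the order matters because of the last part of that cell: `X, Y, U, V = zip(*x2x3)` rebinds `X` and `Y` to the quiver coordinates. Once that line has run, `X[0]` is a single `numpy.float64`, which is exactly the `object of type 'numpy.float64' has no len()` error above when `svm_sgd_plot(X, y)` is called again. Calling it before that line (as suggested) avoids the problem; another option is simply not to reuse the names `X`/`Y` for the arrows. A minimal sketch of that alternative (the `qx`/`qy`/`qu`/`qv` names are my own, not from the notebook):

```python
# Build the two arrows that visualize the separating hyperplane
x2 = [w[0], w[1], -w[1], w[0]]
x3 = [w[0], w[1], w[1], -w[0]]
x2x3 = np.array([x2, x3])

# Use new names so the training data X, y stay untouched
qx, qy, qu, qv = zip(*x2x3)
ax = plt.gca()
ax.quiver(qx, qy, qu, qv, scale=1, color='blue')
plt.show()
```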
Hi, I'm getting an error on the
`#lets perform stochastic gradient descent to learn the seperating hyperplane between both classes`
cell. I'm running this on the latest commit caa60de344a122902d1de631e8c8316bca542829.