llSourcell / Classifying_Data_Using_a_Support_Vector_Machine

This is the code for the "Classifying Data using Gradient Descent" video by Siraj Raval on YouTube

Indentation error #1

Open stoianchoo opened 7 years ago

stoianchoo commented 7 years ago

Hi, I'm getting an

File "<ipython-input-2-089aa09c1903>", line 18
    plt.plot(errors, '|')
      ^
IndentationError: expected an indented block

in the cell that starts with the comment "#lets perform stochastic gradient descent to learn the seperating hyperplane between both classes".

I'm running this on:

You are using Jupyter notebook.

The version of the notebook server is 4.3.1 and is running on:
Python 3.6.0 |Anaconda 4.3.0 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)]

Current Kernel Information:

Python 3.6.0 |Anaconda 4.3.0 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)]
Type "copyright", "credits" or "license" for more information.

IPython 5.1.0 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

on the latest commit, caa60de344a122902d1de631e8c8316bca542829
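
For anyone who lands here with the same message: "expected an indented block" just means a "for"/"if" header ended up with nothing indented beneath it. A tiny self-contained illustration of that error class (the deliberately broken code is kept inside a string, so this is not the notebook's actual cell and nothing is executed):

broken = """
for i in range(3):
    if i < 1:
plt.plot(errors, '|')
"""
try:
    # Compiling (not running) the snippet reproduces the error without executing anything
    compile(broken, "<cell>", "exec")
except IndentationError as e:
    print(e)   # e.g. "expected an indented block" (wording varies by Python version)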

ajclaros commented 7 years ago

The code inside the "for" loop is missing its indentation. Here is a corrected version of the cell:

def svm_sgd_plot(X, Y):
    # Initialize our SVM's weight vector with zeros (3 values)
    w = np.zeros(len(X[0]))
    # The learning rate
    eta = 1
    # How many iterations to train for
    epochs = 100000
    # Store misclassifications so we can plot how they change over time
    errors = []

    # Training part: stochastic gradient descent
    for epoch in range(1, epochs):
        error = 0
        for i, x in enumerate(X):
            if (Y[i] * np.dot(X[i], w)) < 1:
                # Misclassified (or within the margin): update with the
                # hinge-loss gradient plus the regularizer gradient
                w = w + eta * ((X[i] * Y[i]) + (-2 * (1/epoch) * w))
                error = 1
            else:
                # Correctly classified: only apply the regularizer gradient
                w = w + eta * (-2 * (1/epoch) * w)
        errors.append(error)

    # Let's plot the rate of classification errors during training for our SVM
    plt.plot(errors, '|')
    plt.ylim(0.5, 1.5)
    plt.axes().set_yticklabels([])
    plt.xlabel('Epoch')
    plt.ylabel('Misclassified')
    plt.show()

    return w
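
For a quick sanity check, the fixed function can be called on the notebook's small toy dataset. The five samples below are my assumption of that data (two features plus a constant bias feature of -1); adjust if your earlier cells define X and y differently:

import numpy as np
import matplotlib.pyplot as plt

# Assumed toy data (not confirmed in this thread):
# two features per sample plus a bias feature fixed at -1.
X = np.array([
    [-2, 4, -1],
    [ 4, 1, -1],
    [ 1, 6, -1],
    [ 2, 4, -1],
    [ 6, 2, -1],
])
y = np.array([-1, -1, 1, 1, 1])

w = svm_sgd_plot(X, y)   # trains for 100000 epochs, so it takes a little while
print(w)                 # the learned 3-value weight vector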

Also add w = svm_sgd_plot(X,y) in cell [9] before you plot the graph:

for d, sample in enumerate(X):
    # Plot the negative samples
    if d < 2:
        plt.scatter(sample[0], sample[1], s=120, marker='_', linewidths=2)
    # Plot the positive samples
    else:
        plt.scatter(sample[0], sample[1], s=120, marker='+', linewidths=2)

w = svm_sgd_plot(X,y)
# Add our test samples
plt.scatter(2,2, s=120, marker='_', linewidths=2, color='yellow')
plt.scatter(4,3, s=120, marker='+', linewidths=2, color='blue')

# Print the hyperplane calculated by svm_sgd()
# Each row below is [x_start, y_start, x_direction, y_direction] for plt.quiver:
# both arrows start at (w[0], w[1]) and point along +/- the direction perpendicular to it.
x2 = [w[0], w[1], -w[1], w[0]]
x3 = [w[0], w[1], w[1], -w[0]]

x2x3 = np.array([x2, x3])
X, Y, U, V = zip(*x2x3)   # note: this reuses the names X and Y from the training data
ax = plt.gca()
ax.quiver(X, Y, U, V, scale=1, color='blue')
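
If the quiver arrows are hard to read, the learned boundary can also be drawn as a plain line. This is only a sketch under an assumption: that each sample carries a trailing bias feature of -1 (as in the toy data above), so the decision rule is sign(w[0]*x + w[1]*y - w[2]) and the boundary is the line w[0]*x + w[1]*y = w[2]:

import numpy as np
import matplotlib.pyplot as plt

# Assumes w is the vector returned by svm_sgd_plot above and that w[1] != 0.
xs = np.linspace(-2, 6, 50)
ys = (w[2] - w[0] * xs) / w[1]   # solve w[0]*x + w[1]*y = w[2] for y
plt.plot(xs, ys, 'k--', label='learned boundary (assumed bias convention)')
plt.legend()
plt.show()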

sebkouba commented 7 years ago

Thanks @ajclaros, that fixes some of the problems. However, the last section still doesn't quite work for me.

w = svm_sgd_plot(X,y)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-15-187fcd47c636> in <module>()
----> 1 w = svm_sgd_plot(X,y)
      2 #they decrease over time! Our SVM is learning the optimal hyperplane

<ipython-input-13-316b8b5d9d92> in svm_sgd_plot(X, Y)
      3 def svm_sgd_plot(X, Y):
      4     #Initialize our SVMs weight vector with zeros (3 values)
----> 5     w = np.zeros(len(X[0]))
      6     #The learning rate
      7     eta = 1

TypeError: object of type 'numpy.float64' has no len()

Any ideas? Is this even relevant? I'm still trying to wrap my head around some of the code...

Cheers
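
One likely culprit, though it is only a guess from the cells shown above: the plotting cell reassigns X with X,Y,U,V = zip(*x2x3), so once that cell has run, X is no longer the training matrix but a 2-tuple of numpy.float64 values, and len(X[0]) fails with exactly this TypeError. A small sketch of that failure mode (names and data here are hypothetical):

import numpy as np

X = np.array([[-2, 4, -1], [4, 1, -1]])   # training data: len(X[0]) == 3 works fine
w = np.array([1.0, 2.0, 3.0])             # stand-in for the learned weights

x2 = [w[0], w[1], -w[1], w[0]]
x3 = [w[0], w[1], w[1], -w[0]]
x2x3 = np.array([x2, x3])

X, Y, U, V = zip(*x2x3)   # X is now the tuple (1.0, 1.0) of numpy.float64
print(type(X[0]))         # <class 'numpy.float64'>
# len(X[0])               # -> TypeError: object of type 'numpy.float64' has no len()

# One possible fix (hypothetical names): use different names for the quiver inputs,
# e.g. qx, qy, qu, qv, or re-run the data cell before calling svm_sgd_plot again.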

thisHermit commented 7 years ago

@sebkouba can you reproduce the problem when running from the terminal instead of IPython? I.e. copy and paste all the code into a new file (your-chosen-name-here.py) and run it from the command prompt (on Windows) or the terminal (on macOS).

Anjalis9718 commented 5 years ago

@sebkouba try putting w = svm_sgd_plot(X,y) at the top of the cell, right after the end of the function definition:

w = svm_sgd_plot(X,y)

for d, sample in enumerate(X):
    # Plot the negative samples (the first 2)
    if d < 2:
        plt.scatter(sample[0], sample[1], s=120, marker='_', linewidths=2)
    # Plot the positive samples (the last 3)
    else:
        plt.scatter(sample[0], sample[1], s=120, marker='+', linewidths=2)

# Add our test samples
plt.scatter(2,2, s=120, marker='_', linewidths=2, color='yellow')
plt.scatter(4,3, s=120, marker='+', linewidths=2, color='blue')

# Print the hyperplane calculated by svm_sgd()
x2 = [w[0], w[1], -w[1], w[0]]
print(x2)
x3 = [w[0], w[1], w[1], -w[0]]
print(x3)

x2x3 = np.array([x2, x3])
X, Y, U, V = zip(*x2x3)
ax = plt.gca()
ax.quiver(X, Y, U, V, scale=1, color='red')