Closed: yscoffee closed this issue 7 years ago
You should call svmtrain(y, x) rather than svmtrain(x, y); in the libsvm MATLAB interface the label vector comes first. (A corrected sketch follows the quoted report below.)

YS Wong writes:
I tried to run a very simple binary classification via the MATLAB interface of libsvm, where
class A: [1, 1]; class B: [-1, -1] and [1, -1],
but got wrong prediction results (compared to the Python interface): the cases [-1, -1] and [1, -1] are both misclassified.
Here is the sample code:

N = 500;
A_pts = repmat([1, 1], N*2, 1);
A_label = ones(size(A_pts, 1), 1);
B_pts = repmat([-1, -1], N, 1);
B_pts = cat(1, B_pts, repmat([1, -1], N, 1));
B_label = -1 * ones(size(B_pts, 1), 1);
x = [A_pts; B_pts];
y = [A_label; B_label];

svmmodel = svmtrain(x, y);
svmpredict(1, [1, 1], svmmodel)
svmpredict(-1, [-1, -1], svmmodel)  % wrong
svmpredict(-1, [1, -1], svmmodel)   % wrong
output:
optimization finished, #iter = 500
nu = 0.500000
obj = -1000.000000, rho = -1.000000
nSV = 1000, nBSV = 1000
Total nSV = 1000
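For reference, here is a minimal corrected sketch of the same experiment with the argument order fixed as suggested in the reply above (variable names reuse those from the report): svmtrain takes the label vector first and the instance matrix second, and svmpredict takes the test labels, the test instances, and then the model.

N = 500;
A_pts = repmat([1, 1], N*2, 1);
A_label = ones(size(A_pts, 1), 1);
B_pts = [repmat([-1, -1], N, 1); repmat([1, -1], N, 1)];
B_label = -1 * ones(size(B_pts, 1), 1);

x = [A_pts; B_pts];      % instance matrix, one row per sample
y = [A_label; B_label];  % label vector

% labels first, then instances
svmmodel = svmtrain(y, x);

% svmpredict(test_label, test_instance, model); with the model trained on the
% correct argument order, all three points should now be classified correctly
svmpredict(1, [1, 1], svmmodel)
svmpredict(-1, [-1, -1], svmmodel)
svmpredict(-1, [1, -1], svmmodel)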
Thanks