kbenahmed89 opened this issue 8 years ago
Hi, that depends on how different they are. Small differences are accounted for by the non-deterministic nature of some GPU computations due to variable ordering of the arithmetic. Is the difference negligible?
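The reordering effect described above can be demonstrated with plain floating-point arithmetic. This small sketch (illustrative Python, not MatConvNet code) shows that summing the same numbers in a different order can round to a different result — which is what happens when a parallel reduction on a GPU (or a multi-threaded BLAS on a CPU) changes the order of additions between runs or machines:

```python
# Floating-point addition is not associative: summing the same
# numbers in a different order can round differently.
a, b, c = 0.1, 0.2, 0.3

left_to_right = (a + b) + c   # one summation order
reordered = a + (b + c)       # same numbers, different order

print(left_to_right)               # 0.6000000000000001
print(reordered)                   # 0.6
print(left_to_right == reordered)  # False
```

The two results differ only in the last bits of the significand, which is why such discrepancies are typically tiny (on the order of machine epsilon relative to the values involved).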
On 5 Feb 2016, at 00:11, kbenahmed89 notifications@github.com wrote:
Hello,
I'm using the imagenet-vgg-f pretrained network, and when I execute vl_simplenn on different machines I get different feature vectors. What is wrong?
My code is:

```matlab
for i = 1:22
  im_ = single(im) ;
  im_2 = imresize(im_, net.meta.normalization.imageSize(1:2)) ;
  im_3 = im_2 - net.meta.normalization.averageImage(:,:,1) ;
  res = vl_simplenn(net, im_3) ;
  T = res(19).x(:) ;
  MyResult(i,:) = T' ;
end
```
The variables im_, im_2 and im_3 are the same on the two machines, but the vector MyResult is different every time I change the machine used. Is it a bug?
can you help please?
— Reply to this email directly or view it on GitHub https://github.com/vlfeat/matconvnet/issues/402.
Thank you for answering. I used the CPU, not the GPU. The differences are in the range of 1e-6, but when I use these features for classification, the difference in classification results is obvious.
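One way to check whether the cross-machine discrepancy is ordinary floating-point noise (as opposed to a real bug such as different pretrained weights) is to compare the two feature vectors with a small tolerance. A minimal sketch, with hypothetical names — `features_a` and `features_b` stand in for the same row of MyResult computed on the two machines (Python used here for illustration; the same check is easy to write in MATLAB):

```python
def max_differences(features_a, features_b):
    """Return the largest absolute and relative element-wise
    differences between two equal-length feature vectors."""
    abs_diffs = [abs(x - y) for x, y in zip(features_a, features_b)]
    # Guard the denominator so identical zero entries give 0, not a crash.
    rel_diffs = [d / max(abs(x), abs(y), 1e-30)
                 for d, x, y in zip(abs_diffs, features_a, features_b)]
    return max(abs_diffs), max(rel_diffs)

# Toy vectors differing by ~1e-6, the magnitude reported in this thread.
features_a = [0.5, 1.25, -2.0]
features_b = [0.5 + 1e-6, 1.25, -2.0 + 2e-6]
max_abs, max_rel = max_differences(features_a, features_b)
print(max_abs, max_rel)
```

Differences around 1e-6 are consistent with single-precision rounding noise. If a classifier flips its decision on perturbations that small, its decision margin for those samples is extremely narrow, and the classifier rather than the feature extraction is the place to investigate.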
I'm still getting different results on different machines using the CPU while extracting features from the 7th layer of the vgg-f pretrained network. Can you help please?