vlfeat / matconvnet

MatConvNet: CNNs for MATLAB

Evaluating a trained DagNN alexnet model on my own dataset #876

Open ghost opened 7 years ago

ghost commented 7 years ago

I have trained an AlexNet DagNN model from scratch on my own dataset. I went through the code at matconvnet/examples/imagenet/cnn_imagenet_evaluate.m to see how to evaluate the trained model, but as a beginner I find the code difficult to understand. Kindly point me to simplified code for evaluating a trained DagNN model (not a SimpleNN model) on my data (say, imdb.mat). Thanks for your assistance.

layumi commented 7 years ago

@raaju-shiv You can refer to my code. In fact, the main functions are 'getFeature2' and 'eval'.

ghost commented 7 years ago

Thanks for the reply. I see that in your case the model is evaluated with an arbitrary input. I am attempting to generate a confusion matrix to evaluate the performance of the trained model. I have a trained model and my imdb data structure now, and I would like to compute the confusion matrix for the trained DagNN model. I could find examples of generating a confusion matrix for SimpleNN models, like the one shown here:

```matlab
disp('loading imdb...');
imdb = load('imdb.mat');                          % load the imdb structure

disp('loading pretrained model...');
model = load('net.mat');                          % load the trained model
model.net = vl_simplenn_tidy(model.net);          % canonicalize the SimpleNN struct

testSet    = imdb.images.set;                     % indices of the test set
testImgs   = imdb.images.data(:,:,:,testSet == 2);
testLabels = imdb.images.labels(:,testSet == 2);

model.net.layers{end}.type = 'softmax';           % change from softmaxloss for evaluation

total      = size(testImgs,4);
dummy_test = dummyvar(testLabels');               % one-hot encode the true labels

disp('performing predictions...');
res    = vl_simplenn(model.net, testImgs(:,:,:,1:total));
scores = squeeze(gather(res(end).x));             % numClasses x total
[~, predictLabel] = max(scores, [], 1);           % class with the highest score per image

dummy_predict = dummyvar(predictLabel');          % one-hot encode the predictions
figure
plotconfusion(dummy_test', dummy_predict')
```

It would be great if you could help adapt this code to generate the confusion matrix for my DagNN model. Thanks for your valuable assistance.

layumi commented 7 years ago

@raaju-shiv I think you should try it by yourself; I can just give you some pseudocode.

```matlab
%---1. load DagNN model
netStruct = load('your mat file');
net = dagnn.DagNN.loadobj(netStruct.net);
net.mode = 'test';
net.move('gpu');
net.conserveMemory = false;
im_mean = net.meta(1).normalization.averageImage;

%---2. load data

%---3. get softmax score
net.addLayer('softmax', dagnn.SoftMax(), {'prediction'}, {'prediction_softmax'}, {});
for i = 1:numel(your test data)
    f = getFeature2(net, oim, im_mean, 'data', 'prediction_softmax'); % eval data; see getFeature2
    f = reshape(f, 1, []);
    predict_score(i,:) = f;
end

%---4. draw confusion map
plotconfusion(prediction_score, real_score);
```
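Expanding the pseudocode above into a runnable CPU-only sketch (the variable names 'data' and 'prediction' and the presence of net.meta.normalization are assumptions; check net.vars.names for your network, and the model filename is a placeholder):

```matlab
% Minimal DagNN single-image evaluation sketch (CPU).
netStruct = load('your mat file');               % placeholder filename
net = dagnn.DagNN.loadobj(netStruct.net);
net.mode = 'test';
net.conserveMemory = false;                      % keep intermediate variables readable
net.addLayer('softmax', dagnn.SoftMax(), {'prediction'}, {'prediction_softmax'}, {});

im = single(imread('some_image.jpg'));           % placeholder image
im = im - net.meta.normalization.averageImage;   % subtract training mean, if stored

net.eval({'data', im});                          % forward pass
scores = squeeze(net.vars(net.getVarIndex('prediction_softmax')).value);
[bestScore, bestLabel] = max(scores);            % most likely class
```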

ghost commented 7 years ago

Hi, many thanks for your reply. I tried my own code, but I'm getting stuck at the getFeature2 function because my DagNN model does not contain any meta.normalization: I do the normalization while creating my imdb. Here is my modified code:

```matlab
load('s.mat')                  % load the saved DagNN model
load('imdb_5000.mat')          % load my data
imdb = normalizeIMDB(imdb);    % normalize my imdb with a helper function
net = dagnn.DagNN.loadobj(s);
net.mode = 'test';
testSet = imdb.images.set;                        % take the set indices
testImgs = imdb.images.data(:,:,:,testSet == 2);  % take the validation set alone
testlabels = imdb.images.labels(:,testSet == 2);
net.addLayer('softmax', dagnn.SoftMax(), {'prediction'}, {'prediction_softmax'}, {});
predictLabel = zeros(size(testImgs,4),1);  % predicted labels
dummy_test = dummyvar(testlabels');        % true values
total = size(testImgs,4);                  % total size of the validation set
f = getFeature2(net, testImgs, 'data', 'prediction_softmax'); % <-- error occurs here
```

I am not able to evaluate this trained model; the error arises because my DagNN model does not have a meta.normalization. Also, I don't use a GPU, but a CPU. Kindly assist with computing the confusion matrix in analogy to the SimpleNN example program I mentioned above. Thanks for your assistance.

ghost commented 7 years ago

Also, in my case the input is not a single image file: I load an imdb.mat file and take the test images with their labels (set == 2). PFA my DagNN model for your kind reference. My imdb contains 10000 images in total, belonging to two separate classes, which I have split into training (set == 1) and validation (set == 2) sets. I am not able to attach the imdb here because it is 20 MB in size. s (1).zip Kindly help with a modified getFeature2 function that supports this requirement, as requested above. Your assistance will be very useful for proceeding with my project. Many thanks.

layumi commented 7 years ago

@raaju-shiv

```matlab
val_index = find(imdb.images.set == 2);  % get the indices of validation images
for i = 1:numel(val_index)               % loop over the validation set
    test_image = imdb.images.data(:,:,:,val_index(i)); % get one image (may be 224x224x3)
    % 'data' is your input name (it may be called 'x0' in your network; check net.vars.names).
    % 'prediction_softmax' is the output name; change it to whatever you want to extract.
    f = getFeature2(net, test_image, 'data', 'prediction_softmax');
    [~, max_index] = max(f, [], 3);      % if you want the label with the highest probability
end
```
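Combining a loop like the one above with the confusion-matrix step from the SimpleNN example gives a possible end-to-end sketch. It assumes the input variable is named 'data', the softmax output is 'prediction_softmax', and net.conserveMemory is false so the output variable can be read after eval; getFeature2 is replaced by a direct net.eval call:

```matlab
val_index  = find(imdb.images.set == 2);
numClasses = numel(unique(imdb.images.labels));
predictLabel = zeros(1, numel(val_index));

for i = 1:numel(val_index)
    im = imdb.images.data(:,:,:,val_index(i));
    net.eval({'data', im});                                  % forward pass on one image
    scores = squeeze(net.vars(net.getVarIndex('prediction_softmax')).value);
    [~, predictLabel(i)] = max(scores);
end

trueLabels    = double(imdb.images.labels(val_index));
dummy_test    = full(ind2vec(trueLabels, numClasses));       % one-hot ground truth
dummy_predict = full(ind2vec(predictLabel, numClasses));     % one-hot predictions
figure, plotconfusion(dummy_test, dummy_predict)
```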

ghost commented 7 years ago

I am attempting a five-class classification with my trained network. My imdb creation code works only for binary classification. It would be of immense help if you could share your MATLAB code for creating an imdb structure for five classes. Many thanks for your assistance.
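The imdb structure itself does not change between two and five classes; only the label assignment does. A minimal sketch, assuming RGB images stored one class per subfolder ('dataset/class1' ... 'dataset/class5', hypothetical layout) and resized to a fixed input size:

```matlab
numClasses = 5;
imageSize  = [64 64];              % assumed; use your network's input size
data = []; labels = []; sets = [];

for c = 1:numClasses
    classDir = fullfile('dataset', sprintf('class%d', c));   % assumed folder layout
    files = dir(fullfile(classDir, '*.jpg'));
    for k = 1:numel(files)
        im = imresize(single(imread(fullfile(classDir, files(k).name))), imageSize);
        data(:,:,:,end+1) = im;    % stack images along the 4th dimension
        labels(end+1) = c;         % class index 1..5
        sets(end+1)   = 1 + (rand > 0.8);  % ~80% training (1), ~20% validation (2)
    end
end

imdb.images.data   = data;
imdb.images.labels = labels;
imdb.images.set    = sets;
imdb.meta.classes  = arrayfun(@(c) sprintf('class%d', c), 1:numClasses, ...
                              'UniformOutput', false);
```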


OluwoleOyetoke commented 7 years ago

@raaju-shiv Have you figured out how to do this? I could be of help.

ghost commented 7 years ago

@OluwoleOyetoke It would be great if you could help with the evaluation of a DagNN model. I'm yet to receive a convincing answer for this: I can evaluate a SimpleNN model but not a DagNN one. Thanks for your assistance.

OluwoleOyetoke commented 7 years ago

@raaju-shiv Narrow down on what exactly you want to evaluate. Do you mean you want to build a network based on the DagNN model, or do you just want to check network parameters?

OluwoleOyetoke commented 7 years ago

By the way, @raaju-shiv, have you ever had an issue where your network does not converge? After every epoch I get the same error value: a pure straight-line graph, with no descent in error evident. Do you know why this may be?

My learning rate is sufficiently small (0.0005), and I am using the He initialization technique.
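For what it's worth, a flat loss curve can also come from an initialization whose variance does not actually scale with fan-in. He initialization for a conv layer draws zero-mean Gaussian weights with variance 2/fan_in; a minimal sketch (layer sizes here are illustrative only):

```matlab
% He initialization for an h x w conv layer with cIn input channels
% and cOut output channels: std = sqrt(2 / fan_in).
h = 3; w = 3; cIn = 64; cOut = 128;   % example dimensions
fanIn   = h * w * cIn;
weights = sqrt(2 / fanIn) * randn(h, w, cIn, cOut, 'single');
biases  = zeros(cOut, 1, 'single');
```

If the weights are already scaled like this, the other usual suspects are the loss-layer wiring and the label range fed to the loss.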

h612 commented 7 years ago

@OluwoleOyetoke did you find a solution? My net isn't converging either; it's a straight line. I'm using regression with Euclidean loss.

OluwoleOyetoke commented 7 years ago

Hi @h612, it didn't work for me either... I had to resort to using a SimpleNN.

hayatibis commented 7 years ago

Hi @layumi, your solution did work for me, so thanks.

The point I missed is that 'precious' needs to be true in order to extract features from a layer.

I couldn't see any explanation about that in the documentation.
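For anyone landing here later: the flag in question is the 'precious' property on DagNN variables. When conserveMemory is enabled, intermediate variable values are freed during the forward pass unless marked precious. A minimal sketch (variable name 'prediction_softmax' assumed, as in the snippets above):

```matlab
% Mark the variable you want to read after eval() as precious,
% otherwise DagNN may free its value when conserveMemory is true.
varIdx = net.getVarIndex('prediction_softmax');
net.vars(varIdx).precious = true;

net.eval({'data', im});                 % forward pass
feature = net.vars(varIdx).value;       % now safe to read
```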