pjreddie / darknet

Convolutional Neural Networks
http://pjreddie.com/darknet/

Validating IoU with recall #515

Open rafaelpadilla opened 6 years ago

rafaelpadilla commented 6 years ago

I found that the ./darknet detector recall command validates the Intersection over Union (IoU) in a strange way.

The relevant code is the function validate_detector_recall, at line 491 of detector.c.

It does not seem to take into account the class of the detected bounding box. This means that if an object of the class "dog" (for instance) is misclassified as a "house", but its bounding box overlaps its ground truth well, it will still produce a high IoU.

It also sets the threshold to 0.001 and the NMS to 0.4, values that will lead to a high score.

Please correct me if I'm wrong, but shouldn't the class of the detected object be considered?
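
For illustration, a minimal sketch in darknet-style C of a box IoU computation together with the class check in question; the bbox type and both helpers here are hypothetical, not the actual detector.c code:

/* Hypothetical illustration: box IoU in darknet's center/width/height
   box format, plus the class check the recall validation appears to skip. */
typedef struct { float x, y, w, h; int class_id; } bbox;

static float overlap_1d(float c1, float w1, float c2, float w2)
{
    float l = (c1 - w1 / 2 > c2 - w2 / 2) ? c1 - w1 / 2 : c2 - w2 / 2;
    float r = (c1 + w1 / 2 < c2 + w2 / 2) ? c1 + w1 / 2 : c2 + w2 / 2;
    return r - l;
}

static float box_iou(bbox a, bbox b)
{
    float w = overlap_1d(a.x, a.w, b.x, b.w);
    float h = overlap_1d(a.y, a.h, b.y, b.h);
    if (w <= 0 || h <= 0) return 0;   /* no intersection */
    float inter = w * h;
    float uni = a.w * a.h + b.w * b.h - inter;
    return inter / uni;
}

/* Class-aware IoU: a "dog" box matched against a "house" ground truth
   scores 0, no matter how well the rectangles overlap. */
static float class_aware_iou(bbox det, bbox truth)
{
    if (det.class_id != truth.class_id) return 0;
    return box_iou(det, truth);
}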

Mr. Pad

AlexeyAB commented 6 years ago

@rafaelpadilla You are right. ./darknet detector recall gives internal accuracy indicators that cannot be compared with generally accepted metrics.

If you want mAP, IoU, Precision, Recall, and F1 calculated in the conventional way, you can use this fork: https://github.com/AlexeyAB/darknet

And use this command: ./darknet detector map data/obj.data yolo-obj.cfg backup/yolo-obj_7000.weights

rafaelpadilla commented 6 years ago

@AlexeyAB Thanks for your reply, but how does your fork calculate IoU? Does it take into consideration the class of the detected object? Will "cats" be compared to "cats" and "dogs" to "dogs"?

What happens when there are two or more bounding boxes over the same detection?

AlexeyAB commented 6 years ago

@rafaelpadilla

Does it take into consideration the class of the detected object?

Yes. If a detected "cat" does not overlap a ground-truth "cat" (or only overlaps a "dog"), then it is a false_positive and its current_IoU = 0.

What happens when there are two or more bounding boxes over the same detection?

Then only the maximum IoU is added to IoU_sum. All non-maximum IoUs count as 0 and are false_positives.

And at the end: IoU_avg = IoU_sum / (True_positives + False_positives);
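
In code terms, the procedure above can be sketched as follows, reusing the hypothetical class_aware_iou helper from the earlier sketch and an assumed IoU threshold such as 0.5; this is illustrative only, and the linked source below is authoritative:

/* Sketch of the accumulation described above: for each detection keep only
   the best class-matching ground-truth IoU; unmatched or below-threshold
   detections count as false positives with IoU 0. */
float avg_iou(const bbox *dets, int n_dets,
              const bbox *truths, int n_truths, float iou_thresh)
{
    float iou_sum = 0;
    int true_positives = 0, false_positives = 0;

    for (int d = 0; d < n_dets; ++d) {
        float best_iou = 0;
        for (int t = 0; t < n_truths; ++t) {
            float iou = class_aware_iou(dets[d], truths[t]);
            if (iou > best_iou) best_iou = iou;  /* keep only the maximum IoU */
        }
        if (best_iou >= iou_thresh) {
            iou_sum += best_iou;
            ++true_positives;
        } else {
            ++false_positives;  /* non-maximum or unmatched: IoU counts as 0 */
        }
    }
    if (true_positives + false_positives == 0) return 0;
    return iou_sum / (true_positives + false_positives);
}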


Source code: https://github.com/AlexeyAB/darknet/blob/100d6f78011f0a773442411e2882a0203d390585/src/detector.c#L649-L704

And: https://github.com/AlexeyAB/darknet/blob/100d6f78011f0a773442411e2882a0203d390585/src/detector.c#L715

SergeyHayrapetyaNN commented 6 years ago

@AlexeyAB Hello Alexey, I am wondering: is there any way to run evaluation on the validation set during training, so that we can save the weights whenever the validation results improve?

AlexeyAB commented 6 years ago

@sergYaNN Not yet.

SergeyHayrapetyaNN commented 6 years ago

@AlexeyAB Thank you for the reply. I have implemented something like this for V2 and am now working on it for V3. I pass the network to a "validation" function, perform the validation, and decide whether to save the weights. However, there is a new function fuse_conv_batchnorm instead of set_batch_network(&net, 1). Can you please explain the purpose of merging batch-norm layers with conv layers for evaluation?

Alternatively, I would at least like to deep-copy the network, so that during training the validation function does not influence the network's parameters.

AlexeyAB commented 6 years ago

@sergYaNN

However, there is a new function fuse_conv_batchnorm instead of set_batch_network(&net, 1).

There is now network net = parse_network_cfg_custom(cfgfile, 1); instead of these two lines:

network net = parse_network_cfg(cfgfile);
set_batch_network(&net, 1);

But fuse_conv_batchnorm(net); merges the Convolutional and Batch-norm weights into Convolutional weights only, which increases Detection performance by about +7%. It cannot be applied during training.

If you want to call Validation from the Training, just remove or comment out this call to fuse_conv_batchnorm(net);
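
For reference, a minimal sketch of the batch-norm folding arithmetic; the function name and memory layout here are assumptions (one gamma/mean/variance per output filter, weights stored filter-major), not darknet's exact code. At inference, y = gamma * (conv(x) - mean) / sqrt(var + eps) + beta, so the scale folds into the weights and the shift into the bias, leaving a plain convolution:

#include <math.h>

/* Fold batch-norm parameters into convolution weights and biases. */
void fold_batchnorm(float *weights, float *biases,
                    const float *gamma, const float *beta,
                    const float *mean, const float *variance,
                    int n_filters, int weights_per_filter, float eps)
{
    for (int f = 0; f < n_filters; ++f) {
        float scale = gamma[f] / sqrtf(variance[f] + eps);
        for (int i = 0; i < weights_per_filter; ++i)
            weights[f * weights_per_filter + i] *= scale;  /* fold the scale */
        biases[f] = beta[f] - scale * mean[f];             /* fold the shift */
    }
}

Once folded, the forward pass skips the separate normalization step, which is where the speedup comes from; during training the batch statistics keep changing, so the folded weights would go stale, which is why the fusion cannot be applied while training.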

SergeyHayrapetyaNN commented 6 years ago

@AlexeyAB Okay, thanks.

SergeyHayrapetyaNN commented 6 years ago

@AlexeyAB If you do not mind, I will create an issue on the repo then :)

srhtyldz commented 5 years ago

@AlexeyAB How did you draw the mAP chart while training? I passed -map while training, but how do you get the chart? Can you please help with that?

adriacabeza commented 5 years ago

@srhtyldz Are you sure that you are using @AlexeyAB's fork? Maybe you are still working on pjreddie's fork, where this is not implemented.

srhtyldz commented 5 years ago

@srhtyldz Are you sure that you are using @AlexeyAB's fork? Maybe you are still working on pjreddie's fork, where this is not implemented.

Yes, I am. My question is about how to get the chart: where is it created in the darknet folder? I mean the graph or chart.

adriacabeza commented 5 years ago

Hmm, @srhtyldz, I do not know for sure because I have not checked all the source code; however, at a glance I do not see any path where the chart is saved in the function float validate_detector_map(char *datacfg, char *cfgfile, char *weightfile, float thresh_calc_avg_iou, const float iou_thresh, const int map_points, network *existing_net).

Please @AlexeyAB correct me if I am wrong

srhtyldz commented 5 years ago

I got a chart like the one below, but I don't know whether it is good or not; I couldn't interpret it. Is this normal?

[attached image: chart_1]

yudapp commented 4 years ago

@srhtyldz Did you find out how to get the mAP chart? I ran with -map on a remote server but did not get anything. Are the mAP, IoU, etc. saved somewhere?

priyankasinghvi commented 4 years ago

@AlexeyAB I tried using detector map and detector recall, but I am unable to print graphs. It shows the following error: the label file name is too short, and it can't open the label file. I want to be able to plot a confusion matrix, IoU, classification/misclassification, etc. How can I do that? I am, however, able to plot mAP with the training loss. [attached image: Capture00] Thank you.

ghost commented 4 years ago

Does anyone know how to draw or plot the P-R curve for YOLO prediction output? Most papers show a P-R diagram, but I do not know which function of @AlexeyAB's repo they use.

LucWuytens commented 4 years ago

Then only the maximum IoU is added to IoU_sum. All non-maximum IoUs count as 0 and are false_positives.

@AlexeyAB Hi Alexey, I have a question about your statement above. You mention that all non-maximum IoUs = 0 and count as false positives. Do you mean this only when those boxes predict the same class? Isn't non-max suppression supposed to prevent these alternative boxes from occurring before they can be counted as false positives?

If you mean two different boxes for two different classes, then I have an additional question. Suppose the object is a cat and two boxes are predicted: one says dog and the other says cat. If the dog prediction has the higher class confidence score, then the dog prediction results in a false positive for the first box. However, the second box is still considered a true positive in the output of 'darknet.exe detector map', according to my experiments. Can you confirm that?

What I don't know for sure is how this second, lower-ranked TP prediction influences mAP. I assume it is simply ranked (as a TP) like the other predictions and is therefore part of the mAP calculation.

Thanks for the reply, Luc
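
For context on that last question, a minimal sketch of how confidence-ranked detections typically enter an AP computation; this is a simple non-interpolated variant, not necessarily darknet's exact code, and the tp array and n_truths are assumed inputs. A lower-confidence TP simply appears later in the ranking and contributes at a lower precision:

/* Assumes detections are already sorted by descending confidence and
   tp[i] is 1 for a true positive, 0 for a false positive. */
float average_precision(const int *tp, int n_dets, int n_truths)
{
    float ap = 0, prev_recall = 0;
    int tp_count = 0;
    for (int i = 0; i < n_dets; ++i) {
        tp_count += tp[i];
        float precision = (float)tp_count / (i + 1);
        float recall = (float)tp_count / n_truths;
        ap += precision * (recall - prev_recall);  /* area under the P-R curve */
        prev_recall = recall;
    }
    return ap;
}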