jbogp opened this issue 4 years ago

Hello, I'm currently iterating on DeepLogo with some more recent object detection models, and was wondering if you could share the global average precision/recall you obtained on the flickr_logos_27 dataset after training DeepLogo?
Hi @jbogp
I've evaluated DeepLogo using mean average precision, because the TensorFlow Object Detection API provides an evaluation script for that metric. Do you know of any publicly available script for a global average precision/recall metric?
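For reference, mAP here is the mean over the 27 logo classes in flickr_logos_27 of the per-class average precision at a 0.5 IoU threshold. Below is a minimal, self-contained sketch of that computation on toy data, not the Object Detection API's own implementation; the helper names are just for illustration:

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as [xmin, ymin, xmax, ymax]."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def average_precision(detections, groundtruth, iou_thresh=0.5):
    """AP for one class. detections: list of (image_id, score, box);
    groundtruth: dict image_id -> list of boxes. mAP is the mean of this over classes."""
    detections = sorted(detections, key=lambda d: -d[1])          # highest score first
    matched = {img: [False] * len(boxes) for img, boxes in groundtruth.items()}
    n_gt = sum(len(boxes) for boxes in groundtruth.values())
    tp, fp = np.zeros(len(detections)), np.zeros(len(detections))
    for i, (img, _, box) in enumerate(detections):
        overlaps = [iou(box, g) for g in groundtruth.get(img, [])]
        best = int(np.argmax(overlaps)) if overlaps else -1
        if best >= 0 and overlaps[best] >= iou_thresh and not matched[img][best]:
            tp[i], matched[img][best] = 1.0, True                 # first match of this GT box
        else:
            fp[i] = 1.0                                           # poor overlap or duplicate match
    recall = np.cumsum(tp) / max(n_gt, 1)
    precision = np.cumsum(tp) / (np.cumsum(tp) + np.cumsum(fp))
    precision = np.maximum.accumulate(precision[::-1])[::-1]      # monotone PR envelope
    return float(np.sum(np.diff(np.concatenate(([0.0], recall))) * precision))

# Toy check: one ground-truth box, one good detection, one stray detection.
gt = {"img1": [[0, 0, 10, 10]]}
dets = [("img1", 0.9, [0, 0, 10, 9]), ("img1", 0.5, [20, 20, 30, 30])]
print(average_precision(dets, gt))  # 1.0: the good box matches; the stray one adds no recall
```

As far as I understand, this is the quantity the API's Pascal VOC evaluator reports as PascalBoxes_Precision/mAP@0.5IOU.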
Hi @satojkovic I'm sorry, I wrote too fast; I meant mean average precision.
@jbogp
No problem. Here is my evaluation result: PascalBoxes_Precision/mAP@0.5IOU: 0.858025
If possible, could you share your evaluation results when you finish training?
Thanks!
Hi @satojkovic Sorry for the delayed reply. Cool, so I got DetectionBoxes_Precision/mAP@.50IOU: 0.8595 and DetectionBoxes_Precision/mAP@.75IOU: 0.725 using faster_rcnn_inception_resnet.
At 0.5 IOU it's pretty similar, so I guess SSD, which I think is the simpler model, would be the better choice.
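For completeness, those DetectionBoxes_Precision lines are the COCO-style metrics, and they can also be reproduced outside the Object Detection API with pycocotools, assuming the validation ground truth and the model's detections are exported to COCO JSON (the file names below are placeholders):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Placeholder paths: COCO-format ground truth and detection results.
coco_gt = COCO("flickr27_val_groundtruth.json")
coco_dt = coco_gt.loadRes("flickr27_val_detections.json")

evaluator = COCOeval(coco_gt, coco_dt, iouType="bbox")
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()

# stats[0] = AP@[.50:.95], stats[1] = AP@.50, stats[2] = AP@.75
print("mAP@0.50 IOU:", evaluator.stats[1])
print("mAP@0.75 IOU:", evaluator.stats[2])
```

Note that the COCO summary also averages over IoU thresholds from 0.50 to 0.95 (stats[0]), which is a stricter number than either single-threshold value.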
Hi @jbogp
Very interesting results! It might be a good idea to train with a larger dataset than flickr_logos_27 when you use bigger models.