biigle / maia

BIIGLE module for the Machine Learning Assisted Image Annotation method
GNU General Public License v3.0

Investigate classification performance #48

Open mzur opened 4 years ago

mzur commented 4 years ago

Investigate how Mask R-CNN classification performs and whether it can be optionally enabled for MAIA. This would require additional steps during training proposal selection (see #42). However, if existing annotations are provided as training proposals, the annotation labels come for free, so classification could perhaps be offered for this case only. Detection performance must not suffer as a result, though (this is what requires investigation).

mzur commented 3 years ago

References biigle/core#170

mzur commented 1 year ago

This topic was part of a recent master thesis with the following results: Mask R-CNN was trained on a dataset from UnKnoT (which now included annotation labels, too). Classification performance was quite poor, and detection performance was also worse than the baseline (Mask R-CNN on the same data with only a single class for all annotations). Even class imbalance mitigation techniques did not really improve this.
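One common class imbalance mitigation technique is oversampling: duplicating annotations of rare classes until all classes are equally frequent. A minimal sketch of that idea (not the thesis code; the labels and counts below are made up for illustration):

```python
from collections import Counter
import random

def oversample(annotations, seed=0):
    """Duplicate annotations of rare classes until every class
    appears as often as the most frequent one."""
    rng = random.Random(seed)
    counts = Counter(a["label"] for a in annotations)
    target = max(counts.values())
    balanced = list(annotations)
    for label, count in counts.items():
        pool = [a for a in annotations if a["label"] == label]
        # Add random duplicates until this class reaches the target count.
        for _ in range(target - count):
            balanced.append(rng.choice(pool))
    return balanced

# Hypothetical imbalanced annotation set: 40 sponges, 8 sea stars, 2 crabs.
annotations = (
    [{"label": "sponge"}] * 40
    + [{"label": "sea_star"}] * 8
    + [{"label": "crab"}] * 2
)
balanced = oversample(annotations)
print(Counter(a["label"] for a in balanced))  # every class now at 40
```

Naive duplication like this can lead to overfitting on the duplicated rare instances, which may be part of why the mitigation alone did not help classification here.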

One thing did work, though: if the training dataset was processed with the class imbalance mitigation techniques and the class labels were then discarded (so Mask R-CNN was still trained with a single class only), detection precision increased compared to the baseline while recall stayed the same.
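Discarding the labels after the imbalance processing amounts to remapping every annotation to a single foreground class before training. A minimal sketch on a COCO-style annotation dict (the field names follow the COCO convention and are an assumption, not MAIA's actual data format):

```python
def collapse_to_single_class(coco):
    """Remap every annotation to one generic 'object' category so
    Mask R-CNN is trained for detection only. The (possibly
    oversampled) instances themselves are kept unchanged."""
    out = dict(coco)  # shallow copy; input stays untouched
    out["categories"] = [{"id": 1, "name": "object"}]
    out["annotations"] = [
        {**ann, "category_id": 1} for ann in coco["annotations"]
    ]
    return out

# Hypothetical two-class dataset fragment.
dataset = {
    "categories": [{"id": 1, "name": "sponge"}, {"id": 2, "name": "crab"}],
    "annotations": [
        {"id": 1, "image_id": 1, "category_id": 1},
        {"id": 2, "image_id": 1, "category_id": 2},
    ],
}
single = collapse_to_single_class(dataset)
print({a["category_id"] for a in single["annotations"]})  # {1}
```

The instance count (and thus the per-class duplication from the mitigation step) survives the collapse, which would explain why it can still shift detection precision.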

Another (somewhat expected) result was that users took many more of the proposed detections and labels "for granted" than when they had to review detections without class labels. This way they found many more interesting objects (in the default workflow a large percentage of true positives is overlooked), but they also let many false positives slip through.

mzur commented 3 weeks ago

This is now on the roadmap again. We should investigate automatic classification with LabelBOT.