RolnickLab / ami-ml

Software, algorithms and documentation related to the Automated Monitoring of Insects using deep learning and other machine learning methods.
MIT License

Confidence Calibration #43

Open adityajain07 opened 1 week ago

adityajain07 commented 1 week ago

Using the temperature, per-species accuracy, and the number of samples per species, design a formula that displays a calibrated confidence derived from the softmax output.

mihow commented 1 week ago

Temperature calibration. Setting all the configuration details aside, it is known that NNs tend to be "too confident" when predicting classes [GPSW17, MDR+21]. Here, confidence means the probability that the prediction is correct (e.g., the softmax probability of the predicted class). The deep learning literature has developed methods to calibrate this confidence, i.e., to bring it closer to the true probability. One of the simplest and most common calibration factors is the temperature, T, which "softens" the softmax so that the rescaled output probability moves closer to the true probability, cf. [GPSW17].
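To make the "softening" concrete, here is a minimal sketch of a temperature-scaled softmax in NumPy. The function name is hypothetical (not from this repo); it only illustrates that dividing the logits by T > 1 flattens the distribution and lowers the top-class confidence, while T = 1 recovers the standard softmax.

```python
import numpy as np

def softmax_with_temperature(logits, T=1.0):
    """Temperature-scaled softmax (hypothetical helper, for illustration).

    Divides the logits by T before normalizing. T > 1 softens the
    distribution (lower peak confidence); T = 1 is the plain softmax.
    """
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([4.0, 1.0, 0.5])
p_sharp = softmax_with_temperature(logits, T=1.0)  # standard softmax
p_soft = softmax_with_temperature(logits, T=2.0)   # softened distribution
```

With T = 2 the predicted class is unchanged (the ordering of the probabilities is preserved), but its probability drops, which is exactly the overconfidence correction being discussed.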

I believe this is the paper that introduced the concept of temperature calibration and outlines a practical implementation (2017): https://arxiv.org/abs/1706.04599. Here is a survey of multiple methods (updated in 2024): https://arxiv.org/abs/2308.01222
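In [GPSW17], T is a single scalar fitted on a held-out validation set by minimizing the negative log-likelihood (NLL) of the temperature-scaled logits. The sketch below replaces the paper's gradient-based optimization with a simple grid search (an assumption for brevity); the `fit_temperature` and `nll` names are hypothetical.

```python
import numpy as np

def nll(logits, labels, T):
    """Mean negative log-likelihood of temperature-scaled logits."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the T on a coarse grid that minimizes validation NLL.

    [GPSW17] optimizes T with a gradient method; a grid search is used
    here only to keep the sketch dependency-free.
    """
    return min(grid, key=lambda T: nll(logits, labels, T))

# Synthetic demo: labels drawn from softmax(z), but the "model" outputs
# 3 * z, i.e., it is overconfident; fitting should recover T > 1.
rng = np.random.default_rng(0)
n, k = 2000, 5
true_logits = rng.normal(size=(n, k))
e = np.exp(true_logits - true_logits.max(axis=1, keepdims=True))
probs = e / e.sum(axis=1, keepdims=True)
labels = np.array([rng.choice(k, p=p) for p in probs])
T_fit = fit_temperature(3.0 * true_logits, labels)
```

Once T is fitted, the issue's larger question remains open: how to fold per-species accuracy and per-species sample counts into the displayed confidence on top of this scalar T.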