porterjenkins / deep_uncertainty

MIT License

Fixed calibration score so it is now between 0 and 1 #5

Closed spencermyoung513 closed 1 year ago

spencermyoung513 commented 1 year ago

I had an absolute value in the wrong place. After fixing it, I realized that multiplying the "area difference" between perfect calibration and a model's calibration curve by two bounds the score between 0 and 1. I also gave it a proper name: "Average Calibration Score" (inspired by Mean Average Precision, since both aggregate over multiple thresholds).
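For readers outside the thread: the idea described above can be sketched roughly as follows. This is a hypothetical illustration, not the repo's actual implementation — the function name, the trapezoidal integration, and the choice to subtract from 1 (so that 1 means perfectly calibrated) are all assumptions. The key point from the comment is that the area between a calibration curve and the diagonal is at most 1/2 on the unit square, so doubling it normalizes the score to [0, 1].

```python
def average_calibration_score(expected_levels, observed_levels):
    """Sketch of a [0, 1] calibration score (assumed form, not the repo's code).

    `expected_levels` are nominal coverage levels spanning [0, 1] and
    `observed_levels` are the empirical coverages the model achieved at
    each level. Perfect calibration puts the curve on the diagonal.
    """
    # Absolute gap between the model's curve and perfect calibration.
    gaps = [abs(o - e) for e, o in zip(expected_levels, observed_levels)]
    # Trapezoidal estimate of the area between the two curves.
    area = sum(
        0.5 * (gaps[i] + gaps[i + 1]) * (expected_levels[i + 1] - expected_levels[i])
        for i in range(len(gaps) - 1)
    )
    # Max possible area is 1/2, so doubling maps it onto [0, 1];
    # subtracting from 1 makes 1 = perfectly calibrated (an assumption here).
    return 1.0 - 2.0 * area


levels = [i / 10 for i in range(11)]
print(average_calibration_score(levels, levels))        # perfect calibration -> 1.0
print(average_calibration_score(levels, [0.0] * 11))    # maximally miscalibrated -> 0.0
```

Whether the final score is `2 * area` (lower is better) or `1 - 2 * area` (higher is better) is a convention choice; the mAP analogy in the comment suggests higher-is-better, but the source doesn't say explicitly.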