Hello, thank you for this great toolbox. However, I have the same problem as described here (https://github.com/EFS-OpenSource/calibration-framework/issues/48#issue-1964505988). It comes down to the `method` argument of the calibration mapping: with `method='mle'`, the ECE (and the other metrics) is identical to the uncalibrated ECE. This applies to both `TemperatureScaling` and `LogisticCalibration`. If I change the method to `'mcmc'`, the problem no longer occurs. I am currently using version 1.3.6 of `netcal`. Below is example code based on your example and the README:
```python
import numpy as np
from sklearn.model_selection import train_test_split

from netcal.metrics import ECE
from netcal.scaling import LogisticCalibration, TemperatureScaling

# load data
data = np.load("records/cifar100/wideresnet-16-4-cifar-100.npz")
predictions = data['predictions']
ground_truth = data['ground_truth']

# split data set into build set and validation set
pred_train, pred_val, lbl_train, lbl_val = train_test_split(
    predictions, ground_truth,
    test_size=0.7,
    stratify=ground_truth,
    random_state=None,
)

# apply temperature scaling
temperature = TemperatureScaling(detection=False, use_cuda=True, method='mle')
temperature.fit(pred_train, lbl_train)
calibrated_ts = temperature.transform(pred_val)

# apply logistic calibration
lr = LogisticCalibration(detection=False, use_cuda=True, method='mle')
lr.fit(pred_train, lbl_train)
calibrated_lr = lr.transform(pred_val)

# evaluate
n_bins = 10
ece = ECE(n_bins)
uncalibrated_score = ece.measure(pred_val, lbl_val)
calibrated_score_ts = ece.measure(calibrated_ts, lbl_val)
calibrated_score_lr = ece.measure(calibrated_lr, lbl_val)

print(f'uncalibrated ECE: {uncalibrated_score}')
print(f'calibrated ECE with TS: {calibrated_score_ts}')
print(f'calibrated ECE with LR: {calibrated_score_lr}')
```
The output:

```
uncalibrated ECE: 0.05723183405505762
calibrated ECE with TS: 0.05723081579165799
calibrated ECE with LR: 0.05723081579165799
```
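To rule out the metric itself as the cause, here is a minimal top-1 ECE in plain numpy that can be run on `pred_val`, `calibrated_ts`, and `calibrated_lr` as a cross-check. This is a sketch of the standard equal-width-binning definition (weighted mean of |accuracy − confidence| per bin), not netcal's implementation, so small numerical differences are expected:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Top-1 ECE: weighted mean |accuracy - confidence| over equal-width bins."""
    conf = probs.max(axis=1)                  # top-1 confidence per sample
    pred = probs.argmax(axis=1)               # predicted class per sample
    acc = (pred == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)     # samples falling into this bin
        if mask.any():
            ece += mask.mean() * abs(acc[mask].mean() - conf[mask].mean())
    return ece

# tiny sanity check: fully confident and always correct -> ECE of 0
probs = np.eye(3)[[0, 1, 2]]
labels = np.array([0, 1, 2])
print(expected_calibration_error(probs, labels))  # 0.0
```

If this independent ECE also comes out identical before and after `transform`, the calibrated probabilities themselves are unchanged, which points at the `mle` fit rather than the metric.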
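Another way to narrow it down: fit temperature scaling directly with scipy and see whether the optimal temperature is far from 1 (if it is, the `mle` path is apparently not updating its weights). This is a sketch, not netcal's code; it assumes the inputs are softmax probabilities and uses `log(probs)` as stand-in logits:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fit_temperature(probs, labels, eps=1e-12):
    """Fit a scalar T by minimizing the NLL of softmax(log(p) / T)."""
    logits = np.log(np.clip(probs, eps, 1.0))

    def nll(t):
        z = logits / t
        z = z - z.max(axis=1, keepdims=True)   # numerical stability
        logp = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
        return -logp[np.arange(len(labels)), labels].mean()

    res = minimize_scalar(nll, bounds=(0.05, 20.0), method='bounded')
    return res.x

def apply_temperature(probs, t, eps=1e-12):
    """Re-normalize probabilities after dividing the log-probs by T."""
    z = np.log(np.clip(probs, eps, 1.0)) / t
    z = z - z.max(axis=1, keepdims=True)
    p = np.exp(z)
    return p / p.sum(axis=1, keepdims=True)

# synthetic overconfident predictions: ~80% accuracy at 99% confidence
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=2000)
noisy = labels.copy()
flip = rng.random(2000) < 0.3                  # corrupt ~30% of predictions
noisy[flip] = rng.integers(0, 3, size=flip.sum())
probs = np.full((2000, 3), 0.005)
probs[np.arange(2000), noisy] = 0.99
print(fit_temperature(probs, labels) > 1.0)    # True: softening is needed
```

Running `fit_temperature(pred_train, lbl_train)` on your own split should show whether a non-trivial temperature exists; if it does but netcal's `mle` fit leaves the ECE untouched, that supports the hypothesis that the `mle` optimization is the problem.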