p-lambda / verified_calibration

Calibration library and code for the paper: Verified Uncertainty Calibration. Ananya Kumar, Percy Liang, Tengyu Ma. NeurIPS 2019 (Spotlight).
MIT License

Calculate calibration error with softmax distribution and missing class #7

Closed mpitropov closed 3 years ago

mpitropov commented 3 years ago

For example, this works:

>>> import calibration as cal
>>> l0 = [0.8,0.1,0.1]
>>> l1 = [0.7,0.2,0.1]
>>> l2 = [0.3,0.3,0.3]
>>> cal.get_calibration_error([l0,l1,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2], [0,2,1,1,1,1,1,1,1,1,1,1,1,1,1])
0.4353797831268186

But if class 0 never appears in the labels (even though each probability vector still has three entries), the same call raises an AssertionError:

>>> cal.get_calibration_error([l0,l1,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2,l2], [1,2,1,1,1,1,1,1,1,1,1,1,1,1,1])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/matthew/anaconda3/lib/python3.7/site-packages/calibration/utils.py", line 125, in get_calibration_error
    return get_binning_ce(probs, labels, p, debias, mode=mode)
  File "/home/matthew/anaconda3/lib/python3.7/site-packages/calibration/utils.py", line 193, in get_binning_ce
    return _get_ce(probs, labels, p, debias, None, binning_scheme=get_discrete_bins, mode=mode)
  File "/home/matthew/anaconda3/lib/python3.7/site-packages/calibration/utils.py", line 236, in _get_ce
    labels_one_hot = get_labels_one_hot(labels, k=probs.shape[1])
  File "/home/matthew/anaconda3/lib/python3.7/site-packages/calibration/utils.py", line 509, in get_labels_one_hot
    assert np.min(labels) == 0
AssertionError

I have a simple fix that I could open as a PR, but I'm not sure if it is valid; a sketch of the idea is below.
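Roughly, the idea is to relax the check in get_labels_one_hot so that it only requires every label to be a valid column index into probs, instead of requiring class 0 (and every other class) to actually occur in the labels. I don't have the full helper in front of me, so this is just a standalone sketch of the behaviour I mean; the function name get_labels_one_hot and the k=probs.shape[1] call are taken from the traceback above, the rest is my guess:

import numpy as np

def get_labels_one_hot(labels, k):
    # One-hot encode integer labels into k columns.
    # Only require that each label is a valid class index in [0, k),
    # so classes that never occur in `labels` become all-zero columns
    # instead of tripping an assertion.
    labels = np.asarray(labels)
    assert labels.ndim == 1
    assert np.all(labels >= 0) and np.all(labels < k)
    one_hot = np.zeros((labels.size, k))
    one_hot[np.arange(labels.size), labels] = 1.0
    return one_hot

With k = probs.shape[1] = 3 and the labels [1,2,1,...,1] from the failing call, this would produce a valid (15, 3) one-hot matrix even though class 0 is absent, so the call above should go through. Whether the downstream per-class binning still makes sense when a class never appears is exactly the part I'm unsure about.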