EFS-OpenSource / calibration-framework

The net:cal calibration framework is a Python 3 library for measuring and mitigating miscalibration of uncertainty estimates, e.g., by a neural network.
https://efs-opensource.github.io/calibration-framework/
Apache License 2.0

LogisticCalibration use _inverse_sigmoid #20

Closed · JYLFamily closed this issue 3 years ago

JYLFamily commented 3 years ago

Hi Fabian Küppers, when using LogisticCalibration() for binary classification, why do you use _inverse_sigmoid(X) rather than X directly?

netcal version 1.0:

```python
    # if binary, use sigmoid instead of softmax
    if self.num_classes <= 2 or self.independent_probabilities:
        logit = self._inverse_sigmoid(X)
    else:
        logit = self._inverse_softmax(X)

    # otherwise, use SciPy optimization. Usually, this is much faster
    if self.num_classes > 2:
        # convert ground truth to one hot if not binary
        y = self._get_one_hot_encoded_labels(y, self.num_classes)

    # if temperature scaling, fit single parameter
    if self.temperature_only:
        theta_0 = np.array(1.0)

    # else fit bias and weights for each class (one parameter on binary)
    else:
        if self._is_binary_classification():
            theta_0 = np.array([0.0, 1.0])
        else:
            theta_0 = np.concatenate((np.zeros(self.num_classes), np.ones(self.num_classes)))

    # perform minimization of squared loss - invoke SciPy optimization suite
    result = optimize.minimize(fun=self._loss_function, x0=theta_0,
                               args=(logit, y))
```
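The inverse-activation helpers called above are not shown in the excerpt; a plausible sketch of what they compute (an assumption for illustration, not the library's actual code) is:

```python
import numpy as np

def _inverse_sigmoid(confidence):
    # logit(p) = log(p / (1 - p)), clipped to avoid division by zero and log(0)
    eps = np.finfo(float).eps
    clipped = np.clip(confidence, eps, 1.0 - eps)
    return np.log(clipped / (1.0 - clipped))

def _inverse_softmax(confidences):
    # log of the class probabilities recovers the logits up to an additive constant
    eps = np.finfo(float).eps
    clipped = np.clip(confidences, eps, 1.0)
    return np.log(clipped)
```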

Thanks

fabiankueppers commented 3 years ago

Hi @JYLFamily, logistic calibration (aka Platt scaling) as well as temperature scaling are both defined on the logits, that is, the raw output of a neural network before the activation function is applied. Since the library expects confidences as input, it is necessary to compute the inverse activation function to obtain the logits.
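For intuition, a minimal sketch of this mapping (illustrative only, not the library's internal code): with the initial parameters theta_0 = [0.0, 1.0] from the excerpt above, scaling the recovered logit and applying the sigmoid again reproduces the input confidence; the optimizer then adjusts weight and bias on the logit scale.

```python
import numpy as np

def platt_map(confidence, bias=0.0, weight=1.0):
    """Platt scaling on the logit scale: sigmoid(weight * logit(p) + bias)."""
    eps = np.finfo(float).eps
    p = np.clip(confidence, eps, 1.0 - eps)
    logit = np.log(p / (1.0 - p))                         # inverse sigmoid
    return 1.0 / (1.0 + np.exp(-(weight * logit + bias)))  # sigmoid of scaled logit

# with the initial parameters the mapping is the identity (up to clipping)
print(platt_map(0.7))  # -> 0.7
```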

jwitos commented 3 years ago

@fabiankueppers, just to confirm, all methods in the library expect confidences and not logits, correct?

fabiankueppers commented 3 years ago

@jwitos exactly
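
For reference, a minimal usage sketch assuming the documented fit/transform interface of netcal; the confidence and label values below are made up for illustration:

```python
import numpy as np
from netcal.scaling import LogisticCalibration

# confidences are predicted probabilities for the positive class (not logits)
confidences = np.array([0.92, 0.55, 0.30, 0.81])  # hypothetical model outputs
ground_truth = np.array([1, 1, 0, 1])             # corresponding labels

platt = LogisticCalibration()
platt.fit(confidences, ground_truth)
calibrated = platt.transform(confidences)
```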