The net:cal calibration framework is a Python 3 library for measuring and mitigating miscalibration of uncertainty estimates, e.g., by a neural network.
Code to reproduce the issue:

```python
# insert here any model that produces the confidence array; I got this error
# with multiple different models on multiple different binary-classification datasets
```

Sometimes `y_conf_bbq` contains values outside [0, 1]. I suspect a floating-point error: when I inspected the out-of-range values, I got 1.0000000000000002, but since the anomaly was relatively rare I did not run enough repetitions to check whether other anomalous values are possible. If it is indeed a floating-point error, simply clipping the output should fix it.
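A minimal sketch of what such a run looks like, assuming `netcal.binning.BBQ` with the usual `fit`/`transform` interface; the sklearn logistic-regression model and the synthetic dataset are placeholders I picked for illustration (the original runs used other models and datasets), and since the out-of-range values are rare, a single run may not reproduce them:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from netcal.binning import BBQ

# Placeholder binary-classification setup; the issue appeared with several
# different models and datasets, so the exact classifier should not matter.
X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_conf = model.predict_proba(X_test)[:, 1]  # confidence of the positive class

# Fit BBQ on the confidences and calibrate them.
bbq = BBQ()
bbq.fit(y_conf, y_test)
y_conf_bbq = bbq.transform(y_conf)

# Occasionally a value falls just outside [0, 1], e.g. 1.0000000000000002.
print(y_conf_bbq[(y_conf_bbq < 0.0) | (y_conf_bbq > 1.0)])

# Proposed workaround: clip the calibrated confidences back into [0, 1].
y_conf_bbq = np.clip(y_conf_bbq, 0.0, 1.0)
```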