Closed: lathashree01 closed this issue 5 months ago
I am expecting the threshold to be between 0 and 1.
Anomaly detection models do not have their final scores bounded to the [0, 1] range. Instead, they produce a raw anomaly score, where in many models a higher score indicates a higher likelihood that the image is anomalous. The threshold is the value that draws the boundary between the normal and anomalous decision at the image/pixel level.
In your case, the adaptive threshold that achieves the highest anomaly detection performance is computed as 13.60. This means that any prediction score higher than this value indicates an anomaly, and anything lower indicates a normal sample.
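The decision rule can be sketched in a few lines of Python (the scores below are hypothetical; 13.6 is the adaptive threshold from this issue):

```python
# Raw anomaly scores are unbounded; only the comparison against the
# adaptive threshold decides normal vs. anomalous.
THRESHOLD = 13.6  # adaptive threshold computed during validation

def classify(score: float, threshold: float = THRESHOLD) -> str:
    """Label a raw anomaly score using the adaptive threshold."""
    return "anomalous" if score > threshold else "normal"

# Hypothetical raw scores from a PatchCore-style model:
for s in [2.1, 13.5, 13.7, 42.0]:
    print(s, classify(s))
```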
Hope that makes sense.
Describe the bug
Once I finish the training, I would like to understand the threshold the model used for distinguishing the normal vs anomalous data.
If I print the F1AdaptiveThreshold value, it shows 13.6.
I do get good performance metrics, and the anomaly maps correlate well with the confidence displayed. But I can't understand the threshold score. What does it mean when the threshold is 13.6?
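For context, an F1-adaptive threshold is found by sweeping candidate thresholds over the validation scores and keeping the one that maximizes the F1 score. A self-contained sketch of that idea follows (toy scores and labels; anomalib's F1AdaptiveThreshold realizes the same idea via a precision-recall curve, so this is an illustration, not its implementation):

```python
def f1_adaptive_threshold(scores, labels):
    """Return the candidate threshold that maximizes F1 on (scores, labels).

    scores: raw anomaly scores; labels: True for anomalous samples.
    """
    best_t, best_f1 = None, -1.0
    for t in sorted(set(scores)):
        preds = [s >= t for s in scores]
        tp = sum(p and l for p, l in zip(preds, labels))
        fp = sum(p and not l for p, l in zip(preds, labels))
        fn = sum((not p) and l for p, l in zip(preds, labels))
        denom = 2 * tp + fp + fn
        f1 = 2 * tp / denom if denom else 0.0
        if f1 > best_f1:
            best_t, best_f1 = t, f1
    return best_t
```

With well-separated scores, the chosen threshold sits at the lowest anomalous score, which is why its magnitude simply reflects the scale of the raw scores.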
Is there a way to set a confidence parameter for the model during training and inference? I want to have separate workflows based on the confidence value.
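If a [0, 1] confidence value is needed, one common option is a min-max style normalization of the raw scores that maps the adaptive threshold to 0.5 (anomalib's min-max normalization post-processing uses a formula of this shape; the min/max values here are assumed validation statistics, and the numbers are hypothetical):

```python
import numpy as np

def normalize_scores(scores, threshold, score_min, score_max):
    """Map raw anomaly scores into [0, 1], placing the threshold at 0.5."""
    normalized = ((np.asarray(scores, dtype=float) - threshold)
                  / (score_max - score_min)) + 0.5
    return np.clip(normalized, 0.0, 1.0)

# Hypothetical validation statistics: min=2.0, max=25.2, threshold=13.6
conf = normalize_scores([2.0, 13.6, 25.2],
                        threshold=13.6, score_min=2.0, score_max=25.2)
# -> [0.0, 0.5, 1.0]
```

After this transform, "score above threshold" is equivalent to "confidence above 0.5", which makes it easy to branch workflows on a fixed cut-off.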
Thanks, would really appreciate the help.
Dataset
Folder
Model
PatchCore
Steps to reproduce the behavior
Train a Patchcore model and run
engine.threshold
to see the threshold value used during training and testing.
OS information:
Anomalib version: 1.0.1
Python: 3.10
Custom dataset: Folder
Expected behavior
I am expecting the threshold to be between 0 and 1.
Screenshots
No response
Pip/GitHub
pip
What version/branch did you use?
1.1.0
Configuration YAML
Logs