damienfernandes opened this issue 4 months ago
@demeringo To provide scientific rigor when presenting experimental results 🧪
I fully agree that returning the uncertainty is important. I think we need to add an `xxx_absolute_uncertainty` metric for each existing `xxx` metric (except maybe for exact counters like the number of instances, etc.); see the exposition sketch below.
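As a minimal sketch of that naming scheme (metric names, labels, and values here are illustrative, modeled on the gauges cloud-scanner exposes, not its exact output), each impact metric would get a companion `_absolute_uncertainty` gauge:

```
# HELP boavizta_gwp_use_kgco2eq Impact of use, GWP, in kgCO2eq
# TYPE boavizta_gwp_use_kgco2eq gauge
boavizta_gwp_use_kgco2eq{awsregion="eu-west-1"} 0.0834
# HELP boavizta_gwp_use_kgco2eq_absolute_uncertainty Absolute uncertainty of the GWP use impact, in kgCO2eq
# TYPE boavizta_gwp_use_kgco2eq_absolute_uncertainty gauge
boavizta_gwp_use_kgco2eq_absolute_uncertainty{awsregion="eu-west-1"} 0.0112
```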
@damienfernandes I realize that this issue seems really related to cloud-scanner (not strictly the Boavizta API, which already returns uncertainty figures). Do you agree? If yes, I can transfer the issue to the cloud-scanner repository.
Problem
In Prometheus, we don't scrape the error margin for the collected impact values.
Solution
To make the results more explicit, it would be interesting to add the error margin and to display it in Grafana. The absolute uncertainty would be calculated from the raw data block (min, max, value) for every impact, as sketched below.
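A minimal sketch of that calculation, in Rust since cloud-scanner is a Rust project (the struct, field names, and the half-range definition are assumptions, not the actual cloud-scanner types):

```rust
/// One impact value with the raw bounds returned by the Boavizta API.
/// Field names are illustrative, not the exact API schema.
struct ImpactValue {
    value: f64,
    min: f64,
    max: f64,
}

/// Absolute uncertainty taken as the half-width of the [min, max]
/// interval, so the impact can be reported as `value ± uncertainty`.
/// If the interval is not centered on `value`, the larger one-sided
/// deviation could be used instead.
fn absolute_uncertainty(impact: &ImpactValue) -> f64 {
    (impact.max - impact.min) / 2.0
}

fn main() {
    let gwp = ImpactValue { value: 0.0834, min: 0.0722, max: 0.0946 };
    println!("gwp = {} ± {} kgCO2eq", gwp.value, absolute_uncertainty(&gwp));
}
```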
Alternatives
Investigate whether we should add new metrics, or change the existing value metrics to scrape Value ± Absolute Uncertainty.
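One argument for separate metrics: a Grafana panel can derive the error band with plain PromQL, without changing the existing value metric (queries assume the hypothetical metric names from the sketch above):

```
# central value
boavizta_gwp_use_kgco2eq
# lower and upper bounds of the error band
boavizta_gwp_use_kgco2eq - boavizta_gwp_use_kgco2eq_absolute_uncertainty
boavizta_gwp_use_kgco2eq + boavizta_gwp_use_kgco2eq_absolute_uncertainty
```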
Additional context or elements
Absolute Uncertainty
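For reference, assuming the usual convention, a measurement is reported as

$$x = \bar{x} \pm \Delta x, \qquad \Delta x = \frac{x_{\max} - x_{\min}}{2}$$

where min and max come from the raw data block returned by the Boavizta API.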