donbowman opened 1 year ago
AH! So you're querying values right in the database 🤩
So to be clear, for all measurements you have 4 columns: `*_sum` (which is … the sum), then `*_good`, `*_impr` and `*_poor`, which are the number of samples in the Good, Needs Improvement and Poor (Google) classifications.
If you want the total number of samples, just add the `*_good`, `*_impr` and `*_poor` values.
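As a minimal sketch of that arithmetic, assuming hypothetical FCP columns with made-up values (the names mirror the `*_sum`/`*_good`/`*_impr`/`*_poor` pattern above):

```python
# Hypothetical values for one page's FCP row; numbers are invented.
fcp_sum = 1234.5   # fcp_sum: sum of all sampled FCP values
fcp_good = 40      # samples classified Good
fcp_impr = 8       # samples classified Needs Improvement
fcp_poor = 2       # samples classified Poor

# Total sample count is just the sum of the three classification columns.
total_samples = fcp_good + fcp_impr + fcp_poor

# The mean follows from the sum column divided by the count.
mean_fcp = fcp_sum / total_samples
```

This is also why one pathological sample can skew things: it inflates `fcp_sum` while adding only 1 to the count.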
Thanks for the explanation.
The underlying issue: I have a page which does not get a lot of traffic. Some bot opens it and reads it low-and-slow for 9000 s (yes, I have this datapoint!). This completely skews my results, since it's averaged in against a small number of 0.9 s page loads.
So I'm suggesting a couple of features:
- Ideally, I would be able to sort by popular pages with a low score.
- Another option would be a metric like a Holt-Winters prediction, where no single data point dramatically outweighs the others.
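To illustrate the skew, here is a small Python sketch using the standard `statistics` module, with invented numbers matching the scenario above (49 normal loads around 0.9 s plus one 9000 s bot read). A rank-based statistic like the median ignores the outlier that dominates the mean:

```python
from statistics import mean, median

# 49 ordinary page loads around 0.9 s, plus one 9000 s low-and-slow bot read.
samples = [0.9] * 49 + [9000.0]

skewed = mean(samples)    # the single outlier dominates the average
robust = median(samples)  # the middle of the distribution is unaffected
```

The mean lands near 181 s while the median stays at 0.9 s, which is why a percentile (e.g. p75, as Core Web Vitals reporting typically uses) or other robust aggregate gives a truer picture.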
Describe the solution you'd like
I have an issue where some pages on my site that are not super popular are visited once in a while by a bot with a very slow client side. This gives e.g. an FCP in the thousands of seconds, and it confuses me when looking at the results page.
I think having a column on the reports page showing the number of samples that contribute to each value would help, along with a filter to hide rows with fewer than some number of samples.
It is not clear to me what e.g. the FCP column refers to. Is it the average of the samples? The 75th percentile? The median?
I would like to hide results that don't have some statistical confidence behind them.
I think some method to remove or ignore outliers would also be useful.
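A minimal sketch of the proposed minimum-sample filter. The `visible_rows` helper, the row shape, and all numbers are hypothetical, just to show the intent:

```python
def visible_rows(rows, min_samples=10):
    """Hide rows whose sample count is below a confidence threshold."""
    return [r for r in rows if r["samples"] >= min_samples]

# Made-up report rows: a popular page, and a rare page skewed by one slow bot.
rows = [
    {"page": "/popular", "samples": 500, "fcp": 0.9},
    {"page": "/rare",    "samples": 2,   "fcp": 4500.0},
]

# With min_samples=10, only /popular would appear on the report.
kept = visible_rows(rows)
```

The same threshold could drive sorting (popular pages with low scores first), addressing the earlier suggestion.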
Example: