Open swiergot opened 2 years ago
So the question is: why are the timestamps expected to be different?
Because diff is supposed to work with the last and the first entry of a bucket, and I didn't test that feature with statistics. I can make an exception for statistics, but I won't change the behavior for historical data, as that doesn't make sense to me. The timestamp is expected to be different because, if there is a value (for historical data) exactly on the bucket's timestamp, it is the first value of the bucket and nothing else can be "first" in that bucket.
Thanks for your reply.
> The timestamp is expected to be different because if there is a value (for historical data) on the timestamp of the bucket it means that it's the first value of the bucket and nothing else can be "first" in that bucket.
Isn't that in contradiction with the very purpose of start_with_last? Even if there cannot be any earlier value in the bucket, start_with_last adds a value from the previous bucket (as the documentation states). I think that's the behaviour one would expect whenever start_with_last is used, regardless of whether the first value happens to fall exactly on the bucket's timestamp.
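To make sure we are talking about the same thing, here is a minimal sketch of what I understand start_with_last to mean (the function name is mine, not the card's):

```python
# Minimal sketch of the start_with_last semantics as I read the docs;
# this is my reconstruction, not the card's actual code.
def bucket_diff(values, prev_bucket_last=None, start_with_last=False):
    """Diff over one bucket: last value minus the starting value.
    With start_with_last, the start is the previous bucket's last value,
    so consecutive bucket diffs telescope to the overall gain."""
    if start_with_last and prev_bucket_last is not None:
        start = prev_bucket_last
    else:
        start = values[0]
    return values[-1] - start

print(bucket_diff([1, 2, 3, 4, 5, 6], prev_bucket_last=0, start_with_last=True))  # 6
print(bucket_diff([1, 2, 3, 4, 5, 6]))  # 5
```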
To illustrate my point, suppose we have a sensor that yields a new, increasing value every 10 minutes (and resets at midnight).
```
11:40:01  0
11:50:01  0
12:00:01  1
12:10:01  2
12:20:01  3
12:30:01  4
12:40:01  5
12:50:01  6
13:00:01  7
13:10:01  8
13:20:01  9
13:30:01  10
13:40:01  10
13:50:01  10
```
With start_with_last set to true, the diff function will show 6 for hour 12 and 4 for hour 13.
But now suppose the sensor updates 1 second earlier (at full minutes):
```
11:40:00  0
11:50:00  0
12:00:00  1
12:10:00  2
12:20:00  3
12:30:00  4
12:40:00  5
12:50:00  6
13:00:00  7
13:10:00  8
13:20:00  9
13:30:00  10
13:40:00  10
13:50:00  10
```
Due to the timestamp check, start_with_last has no effect here: the diff function will show 5 (6-1) for hour 12 and 3 (10-7) for hour 13.
I would expect the sum of the hourly diffs to equal the daily gain (10 in our example). That holds in the first example but not in the second, merely because the sensor is 1 second faster. That doesn't seem justified to me.
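The discrepancy can be reproduced with a short sketch (helper and flag names are mine, not the card's internals):

```python
from itertools import groupby

def hourly_diffs(samples, emulate_timestamp_check=False):
    """samples: time-sorted list of ('HH:MM:SS', value) pairs.
    Returns {hour: diff} for each hourly bucket, carrying the previous
    bucket's last value over as the start (start_with_last behaviour)."""
    diffs = {}
    prev_last = None
    for hour, group in groupby(samples, key=lambda s: s[0][:2]):
        bucket = list(group)
        values = [v for _, v in bucket]
        # Emulated check: if the first datapoint sits exactly on the
        # bucket boundary, skip the carried-over value.
        on_boundary = bucket[0][0].endswith(":00:00")
        if prev_last is not None and not (emulate_timestamp_check and on_boundary):
            start = prev_last
        else:
            start = values[0]
        diffs[hour] = values[-1] - start
        prev_last = values[-1]
    return diffs

slow = [("11:40:01", 0), ("11:50:01", 0), ("12:00:01", 1), ("12:10:01", 2),
        ("12:20:01", 3), ("12:30:01", 4), ("12:40:01", 5), ("12:50:01", 6),
        ("13:00:01", 7), ("13:10:01", 8), ("13:20:01", 9), ("13:30:01", 10),
        ("13:40:01", 10), ("13:50:01", 10)]
fast = [(ts[:-1] + "0", v) for ts, v in slow]  # same data, 1 second earlier

print(hourly_diffs(slow, emulate_timestamp_check=True))  # {'11': 0, '12': 6, '13': 4} -> sums to 10
print(hourly_diffs(fast, emulate_timestamp_check=True))  # {'11': 0, '12': 5, '13': 3} -> sums to 8
```

With the check emulated, the slow sensor's diffs sum to the daily gain of 10 while the fast sensor's sum to only 8; without the check, both sum to 10.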
I see your point, thanks. I'll update the behavior!
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 10 days.
Hi, I have created a PR for you.
This issue was closed because it has been stalled for 10 days with no activity.
Reopen please.
Still present.
This is not working as expected:
It just shows zeros throughout the entire graph span.
I have debugged the code, and the cause is the third `if` statement here:
Basically, when retrieving hourly long-term statistics, there is only one datapoint in each bucket, and that datapoint's timestamp is always equal to the bucket's timestamp.
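The effect can be sketched like this (my reconstruction of the guard's logic, not the actual source):

```python
# Reconstruction of the kind of check involved: a carried-over value is
# only used when the first datapoint's timestamp differs from the
# bucket's own timestamp.
def stats_bucket_diff(bucket_start, points, prev_last=None):
    """points: time-sorted (timestamp, value) pairs inside one bucket."""
    first_ts, first_val = points[0]
    use_prev = prev_last is not None and first_ts != bucket_start
    start = prev_last if use_prev else first_val
    return points[-1][1] - start

# Hourly long-term statistics: one point per bucket, sitting exactly on
# the bucket timestamp, so the guard always fires and every diff is 0.
print(stats_bucket_diff(12 * 3600, [(12 * 3600, 7)], prev_last=6))  # 0
```

With a single point per bucket and the guard firing on every bucket, each diff collapses to value minus itself, which is why the graph is all zeros.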
So the question is: why are the timestamps expected to be different?