Open jleibs opened 6 months ago
I don't think it needs to add latency if the "smoothed" value is set to always over-estimate the time, i.e. be slightly ahead of the incoming time points. In fact, we HAVE to do that to always see the latest incoming data.
I just had a user call where something very similar to this came up, but where the user wanted smooth playback while always showing data that was e.g. 2 seconds old, i.e. having a fixed-size latency/buffer. The user story is this:
- Data is logged on a `capture_time` timeline, with each time point set to the `capture_time` of the associated camera frame.

It would be great if the X seconds could be automatically inferred from the incoming data, though that might get tricky.
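A minimal sketch of what such a fixed-latency buffer could look like (Python; `FixedLatencyFollower` and its parameters are hypothetical names for illustration, not an existing API): the viewer simply displays the `capture_time` timeline at `latest_received - latency_s`, which stays smooth as long as the data never lags by more than the buffer.

```python
class FixedLatencyFollower:
    """Hypothetical sketch: follow the newest received time minus a fixed buffer."""

    def __init__(self, latency_s=2.0):
        self.latency_s = latency_s
        self.latest_received = None  # newest capture_time seen so far

    def on_data(self, capture_time):
        # Out-of-order samples must never move the displayed time backwards.
        if self.latest_received is None or capture_time > self.latest_received:
            self.latest_received = capture_time

    def display_time(self):
        """Time the viewer should show: always `latency_s` behind the newest data."""
        if self.latest_received is None:
            return None
        return self.latest_received - self.latency_s
```

A real implementation would presumably also advance the displayed time against the wall clock between samples; this sketch only shows the fixed-offset idea.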
Users are currently forced to choose between two modes of playback:
- Follow mode, which always jumps to the latest received time, and
- Fixed-rate playback, which is smooth but steadily falls behind the live stream.

The stutter in follow mode is particularly noticeable in a windowed plot when viewing the data as part of a realtime stream. Any burstiness in the data, such as from network jitter, or from larger payloads like images in the data-stream, causes the plot to feel stuttery.
It would be nice to have the option to turn on some degree of smoothing in follow mode, at the cost of a bit of additional latency. We should be able to dynamically track both the max latency and the rate of progression of the current timeline in order to advance time as smoothly as possible while accounting for burstiness.
In the case of large outliers, we could then jump to the end and reset the estimator.
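One way the estimator described above could be sketched (Python; all names here are hypothetical, not part of any existing API): track the rate of progression of the incoming time points with an exponential moving average, advance a local clock at that estimated rate, keep the clock clamped slightly ahead of the newest data so the latest samples are always visible, and jump to the end plus reset when a large outlier arrives.

```python
class SmoothedTimeEstimator:
    """Hypothetical sketch of a smoothed follow-mode clock.

    The displayed time advances at an EMA-estimated rate and is biased
    slightly ahead of the incoming time points (over-estimation), so the
    newest data is always visible. A large gap triggers a jump-and-reset.
    """

    def __init__(self, alpha=0.2, overshoot_s=0.1, outlier_s=5.0):
        self.alpha = alpha            # EMA weight for the rate estimate
        self.overshoot_s = overshoot_s  # how far ahead of incoming data to sit
        self.outlier_s = outlier_s    # gap size that triggers a reset
        self.rate = 1.0               # estimated data-seconds per wall-second
        self.latest = None            # newest incoming time point
        self.smoothed = None          # the smoothed display time
        self.prev = None              # (wall_time, time_point) of last rate sample

    def on_data(self, t, wall):
        if self.latest is None or abs(t - self.latest) > self.outlier_s:
            # First sample or large outlier: jump to the end, reset the estimator.
            self.latest = t
            self.smoothed = t + self.overshoot_s
            self.rate = 1.0
            self.prev = (wall, t)
            return
        w0, t0 = self.prev
        if wall > w0 and t > t0:
            # Update the rate estimate from the observed progression.
            observed = (t - t0) / (wall - w0)
            self.rate += self.alpha * (observed - self.rate)
            self.prev = (wall, t)
        self.latest = max(self.latest, t)

    def tick(self, dt_wall):
        """Advance the smoothed clock by dt_wall wall-seconds and return it."""
        if self.smoothed is None:
            return None
        self.smoothed += self.rate * dt_wall
        # Never fall behind the newest data: stay slightly ahead of it.
        self.smoothed = max(self.smoothed, self.latest + self.overshoot_s)
        return self.smoothed
```

The constants (`alpha`, `overshoot_s`, `outlier_s`) are placeholders; in practice the overshoot and outlier threshold would presumably be derived from the tracked max latency rather than fixed.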