gavv / signal-estimator

Measure characteristics of a looped back signal.

Think of a better loss ratio estimation algorithm #35

Open gavv opened 1 year ago

gavv commented 1 year ago

In loss estimation mode (-m losses), we want to measure how many "glitches" the loopback introduces, e.g. those caused by packet losses, buffer underruns/overruns, and similar.

The current loss estimation algorithm is very naive and imprecise. We produce a continuous sine wave and count how often per second a "glitch" occurs, where a glitch is defined as a window during which no sample has high amplitude. (This is implemented by applying a running maximum.)
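
Purely for illustration, here is a minimal self-contained sketch of that scheme: a sliding running maximum over sample magnitudes (implemented with a monotonic deque), counting a glitch whenever a whole window passes with no high-amplitude sample. The window size, threshold, and test signal below are made-up parameters, not the actual signal-estimator code:

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <deque>
#include <utility>
#include <vector>

// Sliding-window running maximum of |sample|, using a monotonic deque.
class RunningMax {
public:
    explicit RunningMax(size_t window) : window_(window) {}

    // Push one sample, return the max magnitude over the last `window_` samples.
    double push(double sample) {
        const double mag = std::fabs(sample);
        // Drop queued elements that can no longer be the maximum.
        while (!deque_.empty() && deque_.back().second <= mag)
            deque_.pop_back();
        deque_.emplace_back(pos_, mag);
        // Expire elements that fell out of the window.
        while (deque_.front().first + window_ <= pos_)
            deque_.pop_front();
        pos_++;
        return deque_.front().second;
    }

private:
    size_t window_;
    size_t pos_ = 0;
    std::deque<std::pair<size_t, double>> deque_;
};

int main() {
    const int sample_rate = 48000;   // assumed sample rate
    const size_t window = 96;        // ~2 ms detection window (assumed)
    const double threshold = 0.1;    // "high amplitude" cutoff (assumed)
    const double pi = std::acos(-1.0);

    // One second of a unit 440 Hz sine with an artificial 500-sample
    // dropout (zeroed samples), imitating a loopback glitch.
    std::vector<double> signal(sample_rate);
    for (int i = 0; i < sample_rate; i++)
        signal[i] = std::sin(2 * pi * 440.0 * i / sample_rate);
    for (int i = 20000; i < 20500; i++)
        signal[i] = 0;

    RunningMax rmax(window);
    int glitches = 0;
    bool in_glitch = false;
    for (size_t i = 0; i < signal.size(); i++) {
        const double mx = rmax.push(signal[i]);
        if (i + 1 < window)
            continue; // wait until the window is fully populated
        const bool low = (mx < threshold);
        // A transition to "silent" means a whole window passed without
        // any high-amplitude sample: count one glitch.
        if (low && !in_glitch)
            glitches++;
        in_glitch = low;
    }

    std::printf("glitches per second: %d\n", glitches);
    return 0;
}
```

The monotonic deque keeps the per-sample cost amortized O(1), which is what makes a running maximum cheap enough to apply to a live stream.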

This kind of works, but the measurements are unstable, and it's not very clear how to interpret the units of the result.

We need to invent a better loss ratio metric and an algorithm for computing it.

References: