Currently, rates in the statistics are computed over the last 10 changes. For example:
All Interface LIE FSMs:
+------------------------------------------------------------+-----------------+-------------------------+-------------------+
| Description | Value | Last Rate | Last Change |
| | | Over Last 10 Changes | |
+------------------------------------------------------------+-----------------+-------------------------+-------------------+
| Events TIMER_TICK | 224 Events | 68.08 Events/Sec | 0d 00h:00m:00.41s |
+------------------------------------------------------------+-----------------+-------------------------+-------------------+
This leads to very misleading results, because the window is event-based rather than time-based.
For example, if there is a very quick burst of 10 changes at some point, the rate becomes extremely high. If there are zero changes after that, the rate continues to be reported as very high forever, because the window of the last 10 changes never moves forward.
Instead, compute the rate over the last 10 seconds, so that the reported rate decays to zero when no new events occur.
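A minimal sketch of a time-based sliding window is below. The class and method names (`RateCounter`, `record_event`, `rate_per_second`) are assumptions for illustration, not the project's actual API: event timestamps are kept in a deque, entries older than the window are pruned on each read, and the rate is the number of surviving events divided by the window length.

```python
import collections
import time


class RateCounter:
    """Report the event rate over a sliding time window.

    Hypothetical sketch; names are illustrative, not the project's API.
    """

    def __init__(self, window_seconds=10.0, clock=time.monotonic):
        self._window = window_seconds
        self._clock = clock  # injectable clock, eases testing
        self._timestamps = collections.deque()

    def record_event(self):
        # Remember when each event happened.
        self._timestamps.append(self._clock())

    def rate_per_second(self):
        # Drop timestamps that fell out of the window.
        cutoff = self._clock() - self._window
        while self._timestamps and self._timestamps[0] < cutoff:
            self._timestamps.popleft()
        # An empty window yields 0.0, which fixes the "burst followed
        # by silence reports a huge rate forever" problem.
        return len(self._timestamps) / self._window
```

With this approach, a burst of 10 events still shows a high rate momentarily, but 10 seconds after the last event the reported rate is back to zero.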