Closed LeeCampbell closed 7 years ago
Make it easy for the default recording to be in ticks, but if users want to record values in different units then that should be easy too. I think that using ticks to measure over 2.5 hrs could cause an overflow, making a method like this inaccurate:
public static void RecordLatency(this HistogramBase histogram, Action action)
{
    var start = Stopwatch.GetTimestamp();
    action();
    var elapsedTicks = Stopwatch.GetTimestamp() - start;
    histogram.RecordValue(elapsedTicks);
}
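Since Stopwatch ticks are not the same unit as TimeSpan/DateTime ticks, a variant could convert before recording. This is a sketch only, assuming the HdrHistogram `HistogramBase`/`RecordValue` API from the snippet above; the method name and conversion are my additions, not the library's:

```csharp
using System;
using System.Diagnostics;
using HdrHistogram; // assumed namespace for HistogramBase

public static class HistogramExtensions
{
    // Sketch: record elapsed time in TimeSpan (100 ns) ticks rather than raw
    // Stopwatch ticks, so recorded values line up with DateTime/TimeSpan units.
    public static void RecordLatencyAsTimeSpanTicks(this HistogramBase histogram, Action action)
    {
        var start = Stopwatch.GetTimestamp();
        action();
        var elapsedStopwatchTicks = Stopwatch.GetTimestamp() - start;

        // Scale from Stopwatch.Frequency ticks/second to TimeSpan.TicksPerSecond.
        // Double arithmetic avoids the long overflow that a naive
        // elapsed * TicksPerSecond would hit after ~25 hours on a 10 MHz Stopwatch.
        var timeSpanTicks = (long)(elapsedStopwatchTicks *
            ((double)TimeSpan.TicksPerSecond / Stopwatch.Frequency));
        histogram.RecordValue(timeSpanTicks);
    }
}
```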
Last comment is nonsense. Stopwatch.GetTimestamp() returns ticks as a long, which gives ~29,000 years of ticks.
Last comment is incorrect: Stopwatch.GetTimestamp() returns a different kind of tick from TimeSpan.TicksPerSecond, which is the tick unit used by the DateTime and TimeSpan types. Instead there are Stopwatch.Frequency ticks per second when using Stopwatch.GetTimestamp(), which actually means we have up to ~88,000 years as a max value. Should be plenty.
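The two headroom figures above can both be right, since the maximum depends on the machine's Stopwatch.Frequency. A rough worked calculation (the frequencies are illustrative, not guaranteed values):

```
long.MaxValue ≈ 9.22e18 ticks

at Stopwatch.Frequency = 10,000,000 ticks/s (typical modern high-res timer):
    9.22e18 / 1e7   ≈ 9.22e11 s ≈ 29,000 years

at Stopwatch.Frequency ≈ 3,300,000 ticks/s (some older hardware):
    9.22e18 / 3.3e6 ≈ 2.8e12 s  ≈ 88,000 years
```

Either way, overflow of the timestamp itself is not a practical concern for a 2.5 hr measurement.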
Make it easy to measure with ticks as the default (e.g. a range of TicksInAnHour), and use outputValueUnitScalingRatio to go from ticks to milliseconds (should appease the Mort dev), e.g.

LongHistogram.OutputPercentileDistribution(Console.Out, outputValueUnitScalingRatio: TicksInAMillisecond);
Ideally we just want to:

- get a Histogram
- then call the record method
- then output the results

This allows consumers to not have to know about how it was constructed or how it will be charted.
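The ideal three-step flow described above might look something like the following. This is a hypothetical shape only; the constructor arguments and the Record/OutputPercentileDistribution names are assumptions about the eventual API, not the current one:

```csharp
using System;
using HdrHistogram; // assumed namespace

// 1. get a Histogram (range and precision chosen by the library's defaults or helpers)
var histogram = new LongHistogram(highestTrackableValue: TimeSpan.TicksPerHour, numberOfSignificantValueDigits: 3);

// 2. call the record method with the workload under test
histogram.Record(() => { /* work being measured */ });

// 3. output the results, without knowing how the histogram was constructed
histogram.OutputPercentileDistribution(Console.Out);
```

The point of this shape is that unit handling and scaling live inside the histogram, so the consumer never touches Stopwatch.Frequency directly.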