The log-linear bucket generation scheme doesn't currently allow us to observe values less than 1. Allowing values less than 1 matters in general; specifically, if the unit of observation (seconds) is larger than the typical observed value (say 100 milliseconds, or 0.1 seconds), the smallest bucket is not small enough to track that value in a meaningful way.
I propose that we reset the 'minimum' bucket value to 0.0001. If the unit of observation is 'seconds', that would allow precision down to 100 microseconds, which seems reasonable to me. If a value smaller than 0.0001 is observed, the 0.0001 bucket will be incremented and the exact value will still be added to the _sum field of the histogram.
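To make the proposed behavior concrete, here is a minimal sketch of what the scheme could look like. The function and class names (`log_linear_bounds`, `Histogram`) and the exponent range are my own illustrative assumptions, not the actual implementation: for each power of ten starting at the proposed 0.0001 minimum, we emit nine linear sub-buckets, and any observation below the minimum falls into the smallest bucket while its exact value still flows into the sum field.

```python
import bisect

def log_linear_bounds(min_exp=-4, max_exp=4):
    # Hypothetical log-linear bucket bounds: for each decade from
    # 10^min_exp up to (but not including) 10^max_exp, emit nine
    # linearly spaced upper bounds: 1x, 2x, ..., 9x the decade base.
    bounds = []
    for exp in range(min_exp, max_exp):
        base = 10.0 ** exp
        for step in range(1, 10):
            bounds.append(step * base)
    return bounds

class Histogram:
    def __init__(self):
        self.bounds = log_linear_bounds()  # smallest bound is 0.0001
        self.counts = [0] * (len(self.bounds) + 1)
        self.sum = 0.0  # plays the role of the _sum field

    def observe(self, value):
        # Values below the 0.0001 minimum land in the first bucket,
        # but the exact value still contributes to the sum.
        idx = bisect.bisect_left(self.bounds, value)
        self.counts[idx] += 1
        self.sum += value
```

With this sketch, observing 0.00005 seconds (50 microseconds) increments the 0.0001 bucket, and the sum field retains the exact contribution, so averages computed from sum/count stay accurate even for sub-minimum values.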