brayniac / histogram

rust histogram and percentile stats
Apache License 2.0

Clarify config.max_memory limit #17

Closed — xitep closed this issue 8 years ago

xitep commented 8 years ago

Hello, this is more of a question than an issue:

I don't understand why Histogram::configured validates buckets_total * mem::size_of::<HistogramBucket>(). Isn't the histogram's data stored as one u64 per bucket (while a HistogramBucket is 24 bytes)? What is the exact idea behind config.max_memory?
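For concreteness, here is a small standalone check of the sizes in question; the HistogramBucket layout below is only a guess that would account for the 24 bytes:

```rust
use std::mem;

// Assumed layout: three u64 fields would explain the 24-byte size.
#[allow(dead_code)]
struct HistogramBucket {
    value: u64,
    width: u64,
    count: u64,
}

fn main() {
    assert_eq!(mem::size_of::<u64>(), 8);              // what each bucket actually stores
    assert_eq!(mem::size_of::<HistogramBucket>(), 24); // what the current check multiplies by
    // So buckets_total * size_of::<HistogramBucket>() overestimates the
    // data vector's footprint by a factor of three.
}
```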

Many thanks in advance, P.

brayniac commented 8 years ago

You are correct; that appears to be a bug. max_memory is supposed to let us cap the Histogram's total memory consumption prior to allocation, returning None instead of initializing the histogram. It looks like it should be 8 * buckets_total + mem::size_of::<HistogramConfig>() + mem::size_of::<HistogramProperties>() + mem::size_of::<HistogramCounters>()
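Roughly, the corrected pre-allocation check would look something like the sketch below; the placeholder struct definitions stand in for the real ones so it compiles on its own:

```rust
use std::mem;

// Placeholders so the sketch is self-contained; the real structs live in
// the crate and have non-trivial fields.
#[allow(dead_code)]
struct HistogramConfig { max_memory: usize }
struct HistogramProperties;
struct HistogramCounters;

// Estimated footprint before allocating the data vector: one u64 counter
// per bucket plus the fixed-size bookkeeping structs.
fn estimated_size(buckets_total: usize) -> usize {
    mem::size_of::<u64>() * buckets_total
        + mem::size_of::<HistogramConfig>()
        + mem::size_of::<HistogramProperties>()
        + mem::size_of::<HistogramCounters>()
}

// configured() would return None when this is false.
fn would_fit(buckets_total: usize, config: &HistogramConfig) -> bool {
    estimated_size(buckets_total) <= config.max_memory
}
```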

xitep commented 8 years ago

Many thanks for your response. I think the check should then be merely mem::size_of::<u64>() * buckets_total, since the data vector is the only dynamically sized part we want to keep from eating up too much memory, and it is allocated on the heap, while the rest is constant in size and very likely stored on the stack by client code. What do you think?
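Something along these lines (just a sketch, with an example cap picked arbitrarily):

```rust
use std::mem;

// Only the heap-allocated data vector (one u64 per bucket) counts
// against config.max_memory; the fixed-size structs are ignored.
fn data_vector_fits(buckets_total: usize, max_memory: usize) -> bool {
    mem::size_of::<u64>() * buckets_total <= max_memory
}

fn main() {
    // e.g. a 1 MiB cap allows up to 131_072 buckets (1_048_576 / 8).
    assert!(data_vector_fits(131_072, 1 << 20));
    assert!(!data_vector_fits(131_073, 1 << 20));
}
```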

brayniac commented 8 years ago

I suspect that would be acceptable for most use-cases. I'm not aware of anybody using this currently, but it's clear the current math overstates the size of the Histogram.

xitep commented 8 years ago

Thank you.