I.e., if we know or can approximate the memory usage of each entry in MB and want to bound the cache to X MB. I see there's a fixed capacity internally on the concurrent dictionary, so this is probably not straightforward.
If we picked a suitable N for the capacity bound and did something like this pseudocode:
```csharp
// Interlocked.Add rather than Increment/Decrement, since those only adjust by 1
onUpdated += x => { if (Interlocked.Add(ref totalSize, x.Size) > limit) lfu.Trim((int)(lfu.Count * 0.7)); };
onRemoved += x => Interlocked.Add(ref totalSize, -x.Size);
```
Any thoughts on the performance/stability of this approach?
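For context, here's a minimal, self-contained sketch of the size-tracking side I have in mind. The `SizeTracker` type and the event wiring are hypothetical, not the library's actual API; the point is that the counter must be adjusted with `Interlocked.Add` by the entry's size (negative on removal), and that the over-limit check just signals when a trim should fire:

```csharp
using System;
using System.Threading;

// Hypothetical helper that would be wired to a cache's update/removal
// notifications; it only tracks the approximate total size in MB.
class SizeTracker
{
    private long totalSize;          // running approximate total, in MB
    private readonly long limit;     // size budget, in MB

    public SizeTracker(long limitMb) => limit = limitMb;

    public long TotalSize => Interlocked.Read(ref totalSize);
    public bool OverLimit => TotalSize > limit;

    // Called when an entry is added/updated: add its size atomically.
    // Returns true if the budget is now exceeded (i.e. a trim should run).
    public bool OnAdded(long entrySizeMb)
        => Interlocked.Add(ref totalSize, entrySizeMb) > limit;

    // Called when an entry is removed: subtract its size atomically.
    public void OnRemoved(long entrySizeMb)
        => Interlocked.Add(ref totalSize, -entrySizeMb);
}

class Program
{
    static void Main()
    {
        var tracker = new SizeTracker(limitMb: 100);
        tracker.OnAdded(60);
        bool shouldTrim = tracker.OnAdded(60);   // total 120 > 100
        Console.WriteLine(tracker.TotalSize);    // 120
        Console.WriteLine(shouldTrim);           // True
        tracker.OnRemoved(60);
        Console.WriteLine(tracker.TotalSize);    // 60
    }
}
```

One open question with this design: the size total is only approximate, because a trim triggered from inside the update event can race with concurrent adds, so the total can briefly overshoot the limit before the removal events bring it back down.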