davidfowl opened 3 years ago
@ericsampson I think you nailed the core of the problem here
From my standpoint, 99% of cases would be solved by a ConcurrentDictionary with expiration, as they mainly revolve around small services where I simply don't want to hammer one particular resource too much, with as little overhead as possible.
On the other hand, Nick's use case gives me meme vibes ;)
One thing to consider is how much should be in the runtime as default. There are several caching libraries in the .NET ecosystem (FusionCache, CacheManager, Cache Tower, EasyCaching, LazyCache, MonkeyCache, and probably a bunch of others) which can handle the more complicated and feature rich scenarios.
That's a fair point :) If the .NET/ASP docs for caching can list these community packages, that would go a long way.
I am biased as the creator of one of those caching libraries, but my view is that it is the simple/common scenario that the MemoryCache implementation should aim for: the `ConcurrentDictionary` with expiration scenario. It should be fast, it should be straightforward, and a lot of people should use it; it just shouldn't be all-encompassing.
"straightforward" is an important qualifier :)
Maybe the theoretical MemoryCache extension library could have `GetOrAddLazy*` alongside the current re-entrant factory version, to help discoverability in the IDE etc. Because that's the biggest current footgun for people IME. Cheers
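To make the idea concrete, here is a minimal sketch of what such a `GetOrAddLazy` extension might look like (the name and shape are my assumptions, not a proposed API). It caches a `Lazy<T>` rather than the value itself, so concurrent callers for the same key share a single factory invocation:

```csharp
using System;
using System.Threading;
using Microsoft.Extensions.Caching.Memory;

public static class MemoryCacheLazyExtensions
{
    // Hypothetical sketch: by caching a Lazy<T>, even if multiple threads
    // hit the same missing key, only one of them executes the factory.
    public static T GetOrAddLazy<T>(this IMemoryCache cache, object key,
        TimeSpan expiration, Func<T> factory)
    {
        var lazy = cache.GetOrCreate(key, entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = expiration;
            // ExecutionAndPublication ensures the factory runs at most once
            // per Lazy instance, even under concurrent access.
            return new Lazy<T>(factory,
                LazyThreadSafetyMode.ExecutionAndPublication);
        });
        return lazy.Value;
    }
}
```

Note that `GetOrCreate` itself still has a small race window where two `Lazy<T>` instances can be created, but at most one factory call per cached instance runs, which is what mitigates the stampede in practice.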
+1, the current cache is hard to use; the cache stampede is a very serious problem
> current cache is hard to use, the cache stampede is very serious problem

It's possible to work around manually.
> current cache is hard to use, the cache stampede is very serious problem
If you'd like to avoid the cache stampede problem you can take a look at some alternatives (in alphabetical order):
Hope this helps.
> current cache is hard to use, the cache stampede is very serious problem

> It's possible to work around manually.

Sure, it can be worked around with `Lazy` and locks etc., but not everyone realizes that the cache stampede problem exists, and it can lead to serious consequences. Either way, it is not easy to use. I would very much like a built-in solution. Thank you for this proposal.
> current cache is hard to use, the cache stampede is very serious problem

> It's possible to work around manually.

I know it's a fairly broad question where the implementation may depend on use cases, but what solution would you suggest when the cache key is dynamic? (For example, when you want to cache fetched user permissions, so cache keys could be user-17, user-18, ...) Having a `SemaphoreSlim` per key seems complicated and in the long run implies a memory leak. Relying on `Lazy<T>` semantics may work too, depending on how the current MemoryCache is implemented.
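To make the per-key `SemaphoreSlim` concern concrete, here is a minimal sketch of that pattern (class and method names are mine, not a real API). Note the lock dictionary: entries are never removed, which is exactly the unbounded-growth concern with dynamic keys:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class PerKeyLockCache
{
    private readonly IMemoryCache _cache;
    // One semaphore per key. Nothing ever removes entries from this
    // dictionary, so with dynamic keys it grows without bound.
    private readonly ConcurrentDictionary<object, SemaphoreSlim> _locks = new();

    public PerKeyLockCache(IMemoryCache cache) => _cache = cache;

    public async Task<T> GetOrCreateAsync<T>(object key, TimeSpan expiration,
        Func<Task<T>> factory)
    {
        if (_cache.TryGetValue(key, out T value))
            return value;

        var gate = _locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
        await gate.WaitAsync();
        try
        {
            // Double-check after acquiring the lock: another caller may
            // have populated the entry while we were waiting.
            if (_cache.TryGetValue(key, out value))
                return value;

            value = await factory();
            _cache.Set(key, value, expiration);
            return value;
        }
        finally
        {
            gate.Release();
        }
    }
}
```

This is why the `Lazy<T>`-based approaches mentioned in this thread are often preferred for dynamic keys: the lock's lifetime is tied to the cache entry itself, so cleanup comes for free.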
One of these https://github.com/dotnet/runtime/issues/48567#issuecomment-885756928 ?
Thanks @jodydonetti; since our need is simply an IMemoryCache without the cache stampede issue, nothing more, nothing less, we went with @StephenCleary's solution in the end: https://gist.github.com/StephenCleary/39a2cd0aa3c705a984a4dbbea8275fe9
I like this solution, it's a slim wrapper on top of IMemoryCache and you can easily follow the code.
Consider something extensible enough to support the Azure Caching Guidance.
There's a proposal for a new cache implementation here https://github.com/dotnet/extensions/issues/4766. Can those interested review it and leave comments?
Thanks David, will do!
Interesting that there is no mention of managing cache dependencies in this thread. Not that it would have to be baked into a new cache class, but still seems like a relevant design consideration. Is nobody really using a consistent pattern for this that they want their cache class to handle? Is everyone just opting for inline code in each application component requiring this, and managing its specific set of dependent caches?
ICacheEntry uses an unintuitive pattern for adding new entries
I was bitten by this just recently. I did not see any documentation in CreateEntry() nor on ICacheEntry that stated that the object needed to be disposed to commit it to the cache. As a result, the code I initially wrote didn't have working caching!
I would like to see this documentation clarified for the current version of the cache at least.
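A small self-contained example of the footgun described above, relying on the documented MemoryCache behavior that an entry created via `CreateEntry()` is only committed to the cache when the `ICacheEntry` is disposed:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

class Program
{
    static void Main()
    {
        var cache = new MemoryCache(new MemoryCacheOptions());

        // Footgun: the entry is created and assigned a value, but never
        // disposed, so it is never actually added to the cache.
        var entry = cache.CreateEntry("a");
        entry.Value = 1;
        Console.WriteLine(cache.TryGetValue("a", out _)); // False

        // Disposing the entry is what commits it to the cache.
        using (var entry2 = cache.CreateEntry("b"))
        {
            entry2.Value = 2;
        }
        Console.WriteLine(cache.TryGetValue("b", out _)); // True
    }
}
```

In practice the `Set()` and `GetOrCreate()` extension methods hide this pattern, which is partly why code calling `CreateEntry()` directly so often surprises people.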
Background and Motivation
The MemoryCache implementation and interface leave much to be desired. At the core of it, we ideally want to expose something more akin to a `ConcurrentDictionary<TKey, TValue>` that supports expiration and can handle memory pressure. What we have right now has issues:

Proposed API
The APIs are still TBD but I'm thinking a generic memory cache.
I'm convinced now that this shouldn't be an interface or an abstract class but I'm open to discussion.
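For discussion purposes only, a rough sketch of what a concrete generic cache class could look like, given the note above that it should be neither an interface nor an abstract class (every name and member here is speculative, not the proposed API):

```csharp
using System;

// Speculative shape only: a sealed, concrete generic cache.
public sealed class MemoryCache<TKey, TValue> where TKey : notnull
{
    public bool TryGetValue(TKey key, out TValue value)
        => throw new NotImplementedException(); // sketch only

    // Factory runs at most once per missing key, addressing the
    // stampede problem discussed throughout this thread.
    public TValue GetOrAdd(TKey key, Func<TKey, TValue> factory,
        TimeSpan expiration)
        => throw new NotImplementedException(); // sketch only

    public void Remove(TKey key)
        => throw new NotImplementedException(); // sketch only
}
```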
Usage Examples
TBD
Alternative Designs
TBD
Risks
Having 3 implementations.
cc @Tratcher @JunTaoLuo @maryamariyan @eerhardt