dotnet / runtime

.NET is a cross-platform runtime for cloud, mobile, desktop, and IoT apps.
https://docs.microsoft.com/dotnet/core/
MIT License

New memory cache implementation #48567

Open davidfowl opened 3 years ago

davidfowl commented 3 years ago

Background and Motivation

The MemoryCache implementation and interface leave much to be desired. At its core, we ideally want to expose something more akin to a ConcurrentDictionary<TKey, TValue> that supports expiration and can handle memory pressure. What we have right now has several issues.

Proposed API

The APIs are still TBD but I'm thinking a generic memory cache.

namespace Microsoft.Extensions.Caching
{
    public class MemoryCache<TKey, TValue>
    {
        public TValue this[TKey key] { get; set; }
        public bool IsEmpty { get; }
        public int Count { get; }
        public ICollection<TKey> Keys { get; }
        public ICollection<TValue> Values { get; }
        public void Clear();
        public bool ContainsKey(TKey key);
        public IEnumerator<KeyValuePair<TKey, TValue>> GetEnumerator();
        public KeyValuePair<TKey, TValue>[] ToArray();

        public bool TryAdd(TKey key, CacheEntry<TValue> value);
        public bool TryGetValue(TKey key, [MaybeNullWhen(false)] out TValue value);
        public bool TryRemove(TKey key, [MaybeNullWhen(false)] out TValue value);
    }

    public class CacheEntry<TValue>
    {
        public TValue Value { get; set; }
        public DateTimeOffset? AbsoluteExpiration { get; set; }
        public TimeSpan? AbsoluteExpirationRelativeToNow { get; set; }
        public TimeSpan? SlidingExpiration { get; set; }
        public IList<IChangeToken> ExpirationTokens { get; }
        public IList<PostEvictionCallbackRegistration> PostEvictionCallbacks { get; }
        public CacheItemPriority Priority { get; set; }
        public long? Size { get; set; }
    }
}

I'm convinced now that this shouldn't be an interface or an abstract class but I'm open to discussion.
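To make the proposed shape concrete, here is a hypothetical usage sketch. The API above does not exist yet, so the exact names and semantics are assumptions taken from the sketch:

```csharp
// Hypothetical usage of the proposed MemoryCache<TKey, TValue>;
// this does not compile against any shipped package.
var cache = new MemoryCache<string, string>();

cache.TryAdd("user-17", new CacheEntry<string>
{
    Value = "permissions-blob",
    SlidingExpiration = TimeSpan.FromMinutes(5),
    Priority = CacheItemPriority.Normal
});

if (cache.TryGetValue("user-17", out var permissions))
{
    // Cache hit: use the strongly typed value directly,
    // no casting as with the non-generic IMemoryCache.
}
```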

Usage Examples

TBD

Alternative Designs

TBD

Risks

Having 3 implementations.

cc @Tratcher @JunTaoLuo @maryamariyan @eerhardt

zawor commented 3 years ago

@ericsampson I think you nailed the core of the problem here.

From my standpoint, 99% of cases would be solved by a ConcurrentDictionary with expiration, since they mainly revolve around small services where I simply don't want to hammer one particular resource too hard, with as little overhead as possible.

On the other hand, Nick's use case gives me meme vibes ;)

ericsampson commented 3 years ago

> One thing to consider is how much should be in the runtime as default. There are several caching libraries in the .NET ecosystem (FusionCache, CacheManager, Cache Tower, EasyCaching, LazyCache, MonkeyCache, and probably a bunch of others) which can handle the more complicated and feature rich scenarios.

That's a fair point :) If the .NET/ASP docs for caching can list these community packages, that would go a long way.

> I am biased as the creator of one of those caching libraries but my view is that it is the simple/common scenario that the MemoryCache implementation should aim for - the ConcurrentDictionary with expiration scenario. It should be fast, it should be straight forward and a lot of people should use it, it just shouldn't be all-encompassing.

"straight forward" is an important qualifier :) Maybe the theoretical MemoryCache extension library could have GetOrAddLazy* alongside the current re-entrant factory version, to help discoverability in the IDE etc. Because that's the biggest current footgun for people IME. Cheers

ohroy commented 3 years ago

+1. The current cache is hard to use, and the cache stampede is a very serious problem.

davidfowl commented 3 years ago

> current cache is hard to use, the cache stampede is very serious problem

It's possible to work around manually.
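For reference, the usual manual workaround is to cache a `Lazy<Task<T>>` so that concurrent callers for the same key share a single factory invocation. A minimal sketch using only `ConcurrentDictionary` (a hypothetical helper for illustration; it has no eviction or expiration):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Stampede workaround sketch: the value stored per key is a Lazy<Task<int>>,
// so even under heavy concurrency the expensive factory runs (at most) once.
public static class StampedeDemo
{
    private static readonly ConcurrentDictionary<string, Lazy<Task<int>>> Cache = new();
    private static int _factoryCalls;

    // Exposed only so the coalescing behavior can be observed/verified.
    public static int FactoryCalls => Volatile.Read(ref _factoryCalls);

    public static Task<int> GetOrAddAsync(string key, Func<Task<int>> factory)
    {
        // GetOrAdd may construct a throwaway Lazy when two callers race, but
        // only the Lazy that wins and is stored is ever awaited, and Lazy's
        // default ExecutionAndPublication mode runs its factory at most once.
        var lazy = Cache.GetOrAdd(key, _ => new Lazy<Task<int>>(() =>
        {
            Interlocked.Increment(ref _factoryCalls);
            return factory();
        }));
        return lazy.Value;
    }
}
```

One caveat with this pattern: a faulted `Task` stays cached, so production code usually evicts the entry when the factory throws.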

jodydonetti commented 3 years ago

> current cache is hard to use, the cache stampede is very serious problem

If you'd like to avoid the cache stampede problem you can take a look at some of the alternatives mentioned above (FusionCache, CacheManager, Cache Tower, EasyCaching, LazyCache, MonkeyCache, and others).

Hope this helps.

ohroy commented 3 years ago

> current cache is hard to use, the cache stampede is very serious problem

> Its possible to work around manually.

Sure, it can be worked around with Lazy and locks etc., but not everyone realizes that the cache stampede problem exists, and missing it can lead to serious consequences. Either way, the current cache is not easy to use. I would very much like a built-in solution; thank you for this proposal.

molinch commented 2 years ago

> current cache is hard to use, the cache stampede is very serious problem

> Its possible to work around manually.

I know it's a fairly broad question where the implementation may depend on use cases, but what solution would you suggest when the cache key is dynamic? (For example, when you want to cache fetched user permissions, so cache keys could be user-17, user-18, ...) Having a SemaphoreSlim per key seems complicated and in the long run implies a memory leak. Relying on Lazy<T> semantics may work too, depending on how the current MemoryCache is implemented.
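One bounded-memory variant of the "SemaphoreSlim per key" idea is lock striping: dynamic keys are hashed onto a fixed pool of semaphores, so memory use stays constant at the cost of unrelated keys occasionally sharing a stripe. A hypothetical sketch, not part of any shipped library:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Lock striping: dynamic keys ("user-17", "user-18", ...) map onto a fixed
// pool of semaphores. No per-key allocation, so no unbounded growth; two
// distinct keys sharing a stripe only costs some extra waiting.
public sealed class StripedAsyncLock
{
    private readonly SemaphoreSlim[] _stripes;

    public StripedAsyncLock(int stripeCount = 64)
    {
        _stripes = new SemaphoreSlim[stripeCount];
        for (int i = 0; i < _stripes.Length; i++)
            _stripes[i] = new SemaphoreSlim(1, 1);
    }

    private SemaphoreSlim StripeFor(string key) =>
        _stripes[(key.GetHashCode() & int.MaxValue) % _stripes.Length];

    // Runs `work` while holding the stripe that covers `key`, so two
    // concurrent loads for the same key never overlap.
    public async Task<T> RunExclusiveAsync<T>(string key, Func<Task<T>> work)
    {
        var gate = StripeFor(key);
        await gate.WaitAsync();
        try { return await work(); }
        finally { gate.Release(); }
    }
}
```

A caller would wrap its "check cache, else fetch and store" sequence in `RunExclusiveAsync(cacheKey, ...)` to serialize loads per key.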

jodydonetti commented 2 years ago

> current cache is hard to use, the cache stampede is very serious problem

> Its possible to work around manually.

> I know it's a fairly broad question where implementation may depend on uses cases, but what solution would you suggest when the cache key is dynamic? (for example when you want to cache fetched user permissions so cache key could be user-17, user-18, ...). Having a SemaphoreSlim per key seems complicated and in the long run implies a memory leak. Relying on Lazy<T> semantics may work too depending on how the current MemoryCache is implemented.

One of these? https://github.com/dotnet/runtime/issues/48567#issuecomment-885756928

molinch commented 2 years ago

Thanks @jodydonetti. Since our need is simply an IMemoryCache without the cache stampede issue, nothing more, nothing less, we went with @StephenCleary's solution in the end: https://gist.github.com/StephenCleary/39a2cd0aa3c705a984a4dbbea8275fe9

I like this solution, it's a slim wrapper on top of IMemoryCache and you can easily follow the code.

robertbaumann commented 2 years ago

Consider something extensible enough to support the Azure Caching Guidance.

davidfowl commented 10 months ago

There's a proposal for a new cache implementation here https://github.com/dotnet/extensions/issues/4766. Can those interested review it and leave comments?

jodydonetti commented 10 months ago

Thanks David, will do!

austinmfb commented 10 months ago

Interesting that there is no mention of managing cache dependencies in this thread. Not that it would have to be baked into a new cache class, but still seems like a relevant design consideration. Is nobody really using a consistent pattern for this that they want their cache class to handle? Is everyone just opting for inline code in each application component requiring this, and managing its specific set of dependent caches?

JoelDavidLang commented 10 months ago

> ICacheEntry uses an unintuitive pattern for adding new entries

I was bitten by this just recently. I did not see any documentation on CreateEntry() or ICacheEntry stating that the entry object needs to be disposed in order to commit it to the cache. As a result, the code I initially wrote didn't actually cache anything!

I would like to see this documentation clarified for the current version of the cache at least.