alastairtree / LazyCache

An easy to use, thread-safe, in-memory caching service with a simple developer-friendly API for C#
https://nuget.org/packages/LazyCache
MIT License

Support for IDistributedCache #59

Open alexdobarganes opened 5 years ago

alexdobarganes commented 5 years ago

[To support this feature request please add a thumbs up]

It would be good to extend this current project to support IDistributedCache!

alexdobarganes commented 5 years ago

I was thinking of extending the current ICacheProvider interface to something like this.


using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;

namespace LazyCache.Providers
{
    // Proposed provider backed by IDistributedCache; method bodies still to be implemented.
    public class DistributedCacheProvider : ICacheProvider
    {
        private readonly IDistributedCache _cache;

        public DistributedCacheProvider(IDistributedCache cache)
        {
            _cache = cache;
        }
        public void Dispose()
        {
            throw new NotImplementedException();
        }

        public void Set(string key, object item, MemoryCacheEntryOptions policy)
        {
            throw new NotImplementedException();
        }

        public object Get(string key)
        {
            throw new NotImplementedException();
        }

        public object GetOrCreate<T>(string key, Func<ICacheEntry, T> func)
        {
            throw new NotImplementedException();
        }

        public void Remove(string key)
        {
            throw new NotImplementedException();
        }

        public async Task<T> GetOrCreateAsync<T>(string key, Func<ICacheEntry, Task<T>> func)
        {
            throw new NotImplementedException();
        }
    }
}
alastairtree commented 5 years ago

Yeah, something like that would be the way to go. The challenge is that you need to come up with a way to handle the binary serialisation required by IDistributedCache (I suggest BSON as a default, but with a provider model for the serialiser so anyone can swap out the binary serialiser if needed), and you need to solve the fact that MemoryCacheEntryOptions doesn't make sense for IDistributedCache; instead you need DistributedCacheEntryOptions (https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.caching.distributed.distributedcacheentryoptions?view=aspnetcore-2.2).
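
A minimal sketch of what that serialiser provider model might look like, assuming a hypothetical ICacheSerializer abstraction and the Newtonsoft.Json.Bson package for the BSON default (the names here are illustrative, not part of LazyCache):

using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Bson;

namespace LazyCache.Providers
{
    // Hypothetical abstraction so callers can swap the binary serialiser out.
    public interface ICacheSerializer
    {
        byte[] Serialize<T>(T value);
        T Deserialize<T>(byte[] bytes);
    }

    // Illustrative default using BSON via Newtonsoft.Json.Bson.
    public class BsonCacheSerializer : ICacheSerializer
    {
        private readonly JsonSerializer serializer = new JsonSerializer();

        public byte[] Serialize<T>(T value)
        {
            using (var stream = new MemoryStream())
            using (var writer = new BsonDataWriter(stream))
            {
                serializer.Serialize(writer, value);
                return stream.ToArray();
            }
        }

        public T Deserialize<T>(byte[] bytes)
        {
            using (var stream = new MemoryStream(bytes))
            using (var reader = new BsonDataReader(stream))
            {
                return serializer.Deserialize<T>(reader);
            }
        }
    }
}

Registering a different ICacheSerializer implementation would then be enough to switch from BSON to, say, plain JSON, without touching the cache provider itself.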

alexdobarganes commented 5 years ago

Yes, I agree. I think that part shouldn't be a big deal. What is stopping me currently is the fact that what gets stored is the Lazy `Func<ICacheEntry, T> factory` itself. Here is what I'm doing, and I'm a bit confused: when I run this code, my result is the func, not the result from the factory method!


public object GetOrCreate<T>(string key, Func<ICacheEntry, T> factory)
{
    if (!cache.TryGetValue(key, out object result))
    {
        // TODO: I guess at this point I need to create a CacheEntry and pass it to the factory, like factory(entry)
        result = factory(null);

        // I'm just trying to set some policy here
        Set(key, result, new MemoryCacheEntryOptions
        {
            SlidingExpiration = TimeSpan.FromSeconds(20)
        });
    }

    return (T)result;
}
alexdobarganes commented 5 years ago

I guess I went too fast through this part, which is indeed the most critical, lol. I have implemented a totally new IDistributedAppCache that is very similar, but it breaks when it comes to deserializing the Lazy Task. I need to dig more into this.


alastairtree commented 5 years ago

I don't think there is any point in putting the lazy into the distributed cache, because it would be a serialised version rather than the original one, with references to real databases, connections and so on inside the factory. Better to wait till the lazy is evaluated and cache the result in the distributed cache, I think.
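
A rough sketch of that approach, assuming the hypothetical ICacheSerializer above and the standard IDistributedCache API; the ICacheEntry handling is simplified and the names are illustrative:

using System;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;

namespace LazyCache.Providers
{
    public class DistributedCacheProviderSketch
    {
        private readonly IDistributedCache cache;
        private readonly ICacheSerializer serializer;

        public DistributedCacheProviderSketch(IDistributedCache cache, ICacheSerializer serializer)
        {
            this.cache = cache;
            this.serializer = serializer;
        }

        public T GetOrCreate<T>(string key, Func<ICacheEntry, T> factory, DistributedCacheEntryOptions options)
        {
            // The distributed cache stores bytes, so check it first and deserialise on a hit.
            var bytes = cache.Get(key);
            if (bytes != null)
                return serializer.Deserialize<T>(bytes);

            // Cache miss: evaluate the factory now and cache only its result.
            // The factory delegate (and any Lazy wrapping it) is never serialised.
            var result = factory(null); // a real provider would supply a proper ICacheEntry

            cache.Set(key, serializer.Serialize(result), options);
            return result;
        }
    }
}

Note that this sketch loses the "only one factory call per key" guarantee that LazyCache gets from Lazy<T> in the in-memory case; a real implementation would still want some per-key locking (or a local Lazy) around the miss path to avoid duplicate factory executions.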

alexdobarganes commented 5 years ago

Yes, I was originally inclined to do that. My fatigue yesterday probably clouded my thinking!

alexdobarganes commented 5 years ago

I think I got it! I'll open a pull request so you can take a look!

alexdobarganes commented 5 years ago

How can I share this code with you? I tried to push a new branch but had no success. Could you create a new branch with open access?

alastairtree commented 5 years ago

You need to fork the repo and push to your fork; then you can open a PR from your fork's branch into this repo.

alexdobarganes commented 5 years ago

Thanks

alexdobarganes commented 5 years ago

Do you mind taking a look at it? https://github.com/alexdobarganes/LazyCache/tree/distributed-cache

alastairtree commented 5 years ago

I created a PR for your work @alexdobarganes at #63

miguelcrpinto commented 4 years ago

Any news on this? The last comments were made more than a year ago.

tvblomberg commented 3 years ago

Any news on this?

alastairtree commented 3 years ago

No. I have not had a need for it, so I have not put any time into it, and the PR seems to have stalled.