lukasbindreiter opened 7 months ago
That's an interesting idea, but I'm not sure how big the scope of this issue would be: should this change the existing locking behaviour, or should there be a new loader, e.g. a SuppressedLoader, that would handle per-key locking? At the moment, the latter would make more sense. I like the suggestion of the GetContext api in the issue you linked, I think it could make a lot of sense here as well:
GetContext(ctx context.Context, key K, opts ...Option) (*Item[K, V], error)
This way we can pass in a context (for cancellation) and also get back potential errors from the loader.
If I adapt my loader a little bit to something like the LoaderContext proposed in #120, it could look like this, right?
websiteLoader := ttlcache.LoaderContext[string, string](
	func(c *ttlcache.Cache[string, string], ctx context.Context, key string) (*ttlcache.Item[string, string], error) {
		// load website content
		content, err := ... // make http request to fetch url=key
		if err != nil {
			return nil, err
		}
		item := c.Set(key, content)
		return item, nil
	},
)
But this is actually separate from this issue (lock-per-key); the lock-per-key behaviour could then become a separate option, e.g.
websiteCache := ttlcache.New[string, string](
	ttlcache.WithLoaderContext[string, string](websiteLoader),
	ttlcache.WithLockAndLoad(), // a new option
)
Then the usage could look like this:
ctx := context.Background()
// goroutine1
content, err := websiteCache.GetContext(ctx, "https://www.some.site")
// goroutine2
content, err := websiteCache.GetContext(ctx, "https://www.other.site")
// goroutine3
content, err := websiteCache.GetContext(ctx, "https://www.some.site")
Alternatively, instead of a new option such as ttlcache.WithLockAndLoad, a wrapped loader could of course also be used.
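For reference, such a wrapped loader with per-key locking could be sketched with just the standard library. This is a hypothetical sketch under assumed names (`keyedLoader`, `newKeyedLoader`, `Load` are made up here, not ttlcache APIs): it hands out one mutex per key, so loads of the same key serialize while different keys load in parallel.

```go
package main

import (
	"fmt"
	"sync"
)

// keyedLoader wraps a load function so that concurrent loads of the same
// key serialize while different keys load in parallel. Hypothetical
// sketch; not part of the ttlcache API.
type keyedLoader[K comparable, V any] struct {
	mu    sync.Mutex
	locks map[K]*sync.Mutex
	load  func(K) (V, error)
}

func newKeyedLoader[K comparable, V any](load func(K) (V, error)) *keyedLoader[K, V] {
	return &keyedLoader[K, V]{locks: make(map[K]*sync.Mutex), load: load}
}

func (l *keyedLoader[K, V]) Load(key K) (V, error) {
	l.mu.Lock()
	m, ok := l.locks[key]
	if !ok {
		m = &sync.Mutex{}
		l.locks[key] = m
	}
	l.mu.Unlock()

	m.Lock() // blocks only other loads of the *same* key
	defer m.Unlock()
	return l.load(key)
}

// demo runs three concurrent loads (two for the same URL) and reports
// how often the underlying load function ran per URL.
func demo() (someSite, otherSite int) {
	var mu sync.Mutex
	calls := map[string]int{}
	loader := newKeyedLoader(func(url string) (string, error) {
		mu.Lock()
		calls[url]++
		mu.Unlock()
		return "<html>" + url + "</html>", nil
	})

	var wg sync.WaitGroup
	for _, url := range []string{
		"https://www.some.site",  // goroutine1
		"https://www.other.site", // goroutine2
		"https://www.some.site",  // goroutine3, serialized behind goroutine1
	} {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			if _, err := loader.Load(u); err != nil {
				panic(err)
			}
		}(url)
	}
	wg.Wait()
	return calls["https://www.some.site"], calls["https://www.other.site"]
}

func main() {
	some, other := demo()
	fmt.Println(some, other) // 2 1
}
```

A real implementation would also want to evict unused mutexes (or use something like golang.org/x/sync/singleflight to suppress duplicate loads entirely), but the sketch shows the locking granularity being discussed.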
What do you think, would an API like this make sense? (It would at the very least solve our exact use-case at hand 😄 )
Hi,
I was skimming through the code a little bit, trying to figure out if there is support for locking just a specific key in the cache while a custom loader (that's slow, e.g. one making an HTTP request) is running. For example, imagine a loader that fetches website content over HTTP. Now imagine I have three goroutines called at the exact same time:
Is there a way to add a lock-per-cache-key, e.g. to make goroutine3 block while goroutine1 is still running, but let goroutine2 continue as normal? From what I saw in the code, I think not, right? There is a lockAndLoad parameter here, but if I understood it correctly it locks the whole cache, so in the above example goroutine2 would block while goroutine1 is running, right?
Do you think this would be a feature worth adding? I was thinking about implementing a wrapper around ttlcache that supports something like that, but then I thought this could be a potentially useful feature in general.
Of course, one caveat is that error handling becomes a bit unclear then, so I also see the argument that this is too much magic happening inside the cache instead of letting the user implement it, but I'd be interested in opinions in any case.