Closed EvanK closed 5 years ago
I think this is just beyond the scope of this module. This will work better for you if you just write your own cache wrapper around your exact needs.
I'll let @kanongil decide if he wants to reconsider.
My rationale
I'm doing a ton of cache operations where I have a list of ids ("id"s in the parlance of the policy API) for which I'm generating values with a database query.
Design-wise, I have two clear options:
Proposed addition to API
It would be useful to support generating and getting/setting batches of items in cache at a time. The underlying strategies could then optimize for as little network overhead as possible -- e.g., catbox-redis could use `MGET`/`MSET` and/or pipelining.

I know that operating with a batch of ids and a superset of cache values adds an order of magnitude in complexity, but you could foist that complexity onto the user of the lib by accepting a collection of functions in place of `generateFunc`:

- `batchFunc`: given a list of ids that missed cache (or have become stale?), generate a superset of values (for all the given ids)
- `filterFunc`: given that superset of all newly generated values and an id, filter out values not relevant for the given id (to then set in cache under the given id)
- `mergeFunc`: given multiple values (some from cache, some freshly generated), combine them somehow for catbox to return as a batch

So the proposed batching API would be something like this:
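To make the shape concrete, here is a rough sketch of how a policy configured with those three functions might behave. Everything here is a hypothetical illustration, not catbox's actual API: `getBatch` and the option names are invented for this sketch, and a plain `Map` stands in for a real strategy like catbox-redis.

```javascript
// Hypothetical illustration only -- not the real catbox API.
// A plain Map stands in for a catbox strategy (e.g. catbox-redis).
const cache = new Map();

// Hypothetical batch-aware policy options, replacing generateFunc:
const options = {
    // Given the ids that missed cache, generate a superset of values,
    // e.g. with one database query: SELECT ... WHERE id IN (...)
    batchFunc: async (missedIds) => {
        return missedIds.map((id) => ({ id, value: `fresh-${id}` }));
    },
    // Given that superset and one id, keep only the values for that id.
    filterFunc: (superset, id) => superset.filter((item) => item.id === id),
    // Combine cached and freshly generated values into one batch result.
    mergeFunc: (cached, generated) => [...cached, ...generated]
};

// Sketch of what a policy.getBatch(ids) might do internally.
async function getBatch(ids) {
    const cached = [];
    const missed = [];
    for (const id of ids) {
        if (cache.has(id)) {
            cached.push(cache.get(id));  // could be a single MGET in catbox-redis
        }
        else {
            missed.push(id);
        }
    }

    const generated = [];
    if (missed.length) {
        const superset = await options.batchFunc(missed);
        for (const id of missed) {
            const values = options.filterFunc(superset, id);
            cache.set(id, values[0]);    // could be a single MSET / pipeline
            generated.push(...values);
        }
    }

    return options.mergeFunc(cached, generated);
}
```

On a cold cache, `getBatch(['a', 'b'])` would call `batchFunc` once with both ids; a second call would serve both ids from cache without touching `batchFunc` at all, which is the whole point of batching the miss path.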