medikoo / memoizee

Complete memoize/cache solution for JavaScript
ISC License

v1.0 API #73

Open medikoo opened 7 years ago

medikoo commented 7 years ago

A live (updated in place) proposal for v1.0 API:


The signature of the main memoizee function will remain the same: memoizee(fn[, options])

Supported options:

contextMode, possible values:

resolutionMode, possible values:

serialize

length

Will work nearly exactly the same as in the current version. One difference: dynamic length will have to be indicated with -1 and not false
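A minimal sketch of the proposed `length` semantics (hypothetical helper, not memoizee's implementation): a non-negative length memoizes on only the first N arguments, while -1 means the relevant argument count is resolved dynamically per call.

```javascript
// Sketch only: illustrates `length` semantics, where -1 means "dynamic".
function memoizeWithLength(fn, length) {
  const cache = new Map();
  return function (...args) {
    const relevant = length === -1 ? args : args.slice(0, length);
    const key = JSON.stringify(relevant); // naive key; a real impl normalizes properly
    if (!cache.has(key)) cache.set(key, fn.apply(this, args));
    return cache.get(key);
  };
}
```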

normalizers

Argument normalizers; this is what's represented now by resolvers, and otherwise it will work exactly the same
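The idea of per-argument normalizers can be sketched as follows (hypothetical helper, not memoizee's code): each normalizer coerces its argument before the cache key is computed, so equivalent inputs hit the same entry.

```javascript
// Sketch only: per-argument normalizers applied before key computation.
function memoizeNormalized(fn, normalizers) {
  const cache = new Map();
  return function (...args) {
    const key = args
      .map((arg, i) => (normalizers[i] ? normalizers[i](arg) : arg))
      .join("\u0001"); // separator unlikely to appear in normalized values
    if (!cache.has(key)) cache.set(key, fn.apply(this, args));
    return cache.get(key);
  };
}
```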

ttl (previously maxAge)

Will represent the same feature as in the current version, with the following changes and improvements:

Additionally:
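As a baseline for the above, a minimal sketch of ttl semantics (assumed behavior: an entry expires a fixed number of milliseconds after being cached; hypothetical helper, not memoizee's code):

```javascript
// Sketch only: entry expires `ttl` ms after caching. The clock is injectable
// so the behavior can be tested deterministically.
function memoizeTtl(fn, ttl, now = Date.now) {
  const cache = new Map(); // key -> { value, expiresAt }
  return function (...args) {
    const key = JSON.stringify(args);
    const hit = cache.get(key);
    if (hit && hit.expiresAt > now()) return hit.value;
    const value = fn.apply(this, args);
    cache.set(key, { value, expiresAt: now() + ttl });
    return value;
  };
}
```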

max

Will work the same way as now. Still, the performance of lru-queue will have to be revised; we should not drag behind lru-cache.

Additionally, in the async case, the setting should take effect at invocation, not at resolution as it does currently (see #131)
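The invocation-time accounting can be sketched like this (hypothetical helper, not memoizee's code): the cached entry, which may be a still-pending promise, enters the LRU bookkeeping synchronously at call time, so concurrent in-flight calls cannot overshoot the limit.

```javascript
// Sketch only: `max` accounted at invocation. A Map iterates keys in
// insertion order, so the first key is always the least recently used.
function memoizeMax(fn, max) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) {
      const value = cache.get(key);
      cache.delete(key);
      cache.set(key, value); // refresh recency
      return value;
    }
    const value = fn.apply(this, args); // may be a pending promise
    cache.set(key, value);
    if (cache.size > max) cache.delete(cache.keys().next().value); // evict LRU
    return value;
  };
}
```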

refCounter

Will work the same way as it does now.
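The ref-counting idea can be sketched as follows (hypothetical helper and method names, not memoizee's actual API): each memoized call increments a counter for its entry, and releasing the last reference clears it from the cache.

```javascript
// Sketch only: ref-counted cache entries with a hypothetical deleteRef method.
function memoizeRefCounted(fn) {
  const cache = new Map(); // key -> { value, refs }
  const memoized = function (...args) {
    const key = JSON.stringify(args);
    let entry = cache.get(key);
    if (!entry) {
      entry = { value: fn.apply(this, args), refs: 0 };
      cache.set(key, entry);
    }
    entry.refs += 1;
    return entry.value;
  };
  memoized.deleteRef = function (...args) {
    const key = JSON.stringify(args);
    const entry = cache.get(key);
    if (!entry) return;
    entry.refs -= 1;
    if (entry.refs <= 0) cache.delete(key); // last reference released
  };
  return memoized;
}
```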

Memoize configuration objects

Each memoized function will expose a memoization object, which will provide access to events and methods for inspecting and operating on the cache manually

It will be either an instance of Memoizee (exposed on the memoizedFn.memoizee property) or an instance of MemoizeeFactory (exposed on the memoizedFn.memoizeeFactory property).

Memoizee

Its instance will be exposed on memoizedFn.memoizee when memoization is configured with the 'function' contextMode (that's the default).

Methods

Events

MemoizeeFactory

Its instance will be exposed on memoizedFn.memoizeeFactory when memoization is configured with 'weak' or 'method' contextMode.

It will produce distinct Memoizee instances; e.g. in the case of 'method', a different Memoizee instance will be created for each different context. The same goes for 'weak' contextMode when length > 1. In the case of 'weak' with length === 1, there will either be another dedicated class, or the MemoizeeFactory instance will not produce any Memoizee instances (it will just handle context objects).

Methods

In the methods below, value can be: a memoized method (in the case of 'method'), a Memoizee instance (in the case of 'weak' with length > 1), or a cached value (in the case of 'weak' with length === 1)

There will be no means to iterate over all contexts for which values have been resolved, as we will not keep handles to processed contexts in a factory (to avoid blocking them from garbage collection).
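The factory idea can be sketched with a WeakMap (hypothetical helper, not memoizee's code): per-context caches are reachable only through the context object itself, so dropping the context lets its whole cache be garbage-collected, and, by the same token, the contexts cannot be enumerated.

```javascript
// Sketch only: one cache per `this` context, held weakly so contexts stay
// collectable. WeakMap keys are not enumerable, which is why iterating all
// contexts is impossible.
function memoizeMethod(fn) {
  const perContext = new WeakMap(); // context -> Map of arg-key -> value
  return function (...args) {
    let cache = perContext.get(this);
    if (!cache) {
      cache = new Map();
      perContext.set(this, cache);
    }
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn.apply(this, args));
    return cache.get(key);
  };
}
```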

Events


This is just a rough proposal; it's important that performance is at least maintained and at best improved where possible, so some deviations from the above are possible.

It might be good to also consider:

Rush commented 7 years ago

maxAge (or ttl, to be decided)

maxAge has always been confusing to me. ttl would be vastly more intuitive (or timeToLive)

Rush commented 7 years ago

The 'promise' target of memoization is an asynchronous function that returns a promise. The eventual promise-mode handling variants could be forced via promise:then or promise:done. If possible, it might be good to autodetect ES2017 async functions and make it the default mode for those

Is it possible to make this configurable? I don't want memoizee to support bluebird out of the box but it would be cool if we could hook in a custom way to work with promises.

medikoo commented 7 years ago

Is it possible to make this configurable? I don't want memoizee to support bluebird out of the box

What exactly do you mean? promise mode won't be forced in any way, and support will be generic (no matter what promise library you're using, if it's compliant it will just work). It actually already works that way, with the small difference that sometimes you need to do promise:then instead of just promise to avoid issues in some implementations.
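The generic then-based handling described here can be sketched as follows (a sketch of the idea, not memoizee's implementation): the pending promise is cached at invocation, so concurrent calls share one in-flight result, and only the standard `then` is used, so any compliant promise library works. Rejections are dropped from the cache so a failure is not memoized.

```javascript
// Sketch only: promise-mode memoization using nothing beyond standard `then`.
function memoizePromise(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);
    const promise = Promise.resolve(fn.apply(this, args));
    cache.set(key, promise);
    promise.then(null, () => cache.delete(key)); // forget rejected results
    return promise;
  };
}
```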

Rush commented 7 years ago

@medikoo isn't this a good reason to provide custom way to introspect promises?

medikoo commented 7 years ago

@medikoo isn't this a good reason to provide custom way to introspect promises?

I think then, done, and finally are pretty standard, and internal handling of those should allow solving this well for any promise library (without adding other custom hooks)

fazouane-marouane commented 5 years ago

@medikoo would you need some help on this? I can particularly help a bit on the LRU part. I noticed the use of objects and loops, while Maps and linked lists may be better suited for the task.
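The suggested structure can be sketched like this (hypothetical API, not lru-queue's or lru-cache's actual code): a doubly linked list ordered by recency plus a Map from key to list node gives O(1) hits, refreshes, and evictions, with no loops over the keys.

```javascript
// Sketch only: O(1) LRU queue. `hit` marks a key as most recently used and
// returns the evicted key, if the limit was exceeded.
class LruQueue {
  constructor(limit) {
    this.limit = limit;
    this.nodes = new Map(); // key -> list node
    this.head = null; // most recently used
    this.tail = null; // least recently used
  }
  hit(key) {
    let node = this.nodes.get(key);
    if (node) this._unlink(node);
    else node = { key, prev: null, next: null };
    this.nodes.set(key, node);
    this._pushFront(node);
    if (this.nodes.size > this.limit) {
      const evicted = this.tail;
      this._unlink(evicted);
      this.nodes.delete(evicted.key);
      return evicted.key;
    }
  }
  _pushFront(node) {
    node.prev = null;
    node.next = this.head;
    if (this.head) this.head.prev = node;
    this.head = node;
    if (!this.tail) this.tail = node;
  }
  _unlink(node) {
    if (node.prev) node.prev.next = node.next; else this.head = node.next;
    if (node.next) node.next.prev = node.prev; else this.tail = node.prev;
  }
}
```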

medikoo commented 5 years ago

@fazouane-marouane great, thanks for that initiative! However, the situation is a bit difficult, as this new version is about a complete rewrite

Anyway, you mentioned you can help a bit on the LRU part. Currently our LRU handling depends on the lru-queue package, which, as I tested it (a longer while ago), was not as efficient as e.g. the other popular library lru-cache. I don't remember pinpointing why exactly we are dragging behind. If you can diagnose it, and eventually (if applicable) propose some optimizations to lru-queue that bring us to the desired level, that will be something, and will mark that task as solved.

fazouane-marouane commented 5 years ago

@medikoo a quick update on the subject: we'll soon have a speedup of 18x on integer keys and between 2.5x and 5x on string keys. It'll be a drop-in replacement for the current lru-queue implementation. I'll propose a pull request soon. I'll test https://www.npmjs.com/package/hashtable first, as I suspect it'll bring even more speedup for string keys.

joanned commented 5 years ago

@medikoo Will the recoverySpan option be available? Basically, we want some safeguard to return previously cached data in case our promise fails, e.g. in case the API we're calling has an outage of some sort. Is there any way to do this currently?

medikoo commented 5 years ago

@joanned Unfortunately it's not available now, and it's hard to sneak it into the current version due to its design limitations. It's scheduled for v1
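In the meantime, the behavior being asked for can be approximated outside of memoizee with a manual wrapper (a sketch with hypothetical names, not a memoizee feature): keep the last successful value per key and fall back to it when the underlying promise rejects.

```javascript
// Sketch only: stale-on-error fallback around an async function.
function staleOnError(fn) {
  const lastGood = new Map(); // key -> last successfully resolved value
  return function (...args) {
    const key = JSON.stringify(args);
    return Promise.resolve(fn.apply(this, args)).then(
      (value) => {
        lastGood.set(key, value);
        return value;
      },
      (err) => {
        if (lastGood.has(key)) return lastGood.get(key); // serve stale data
        throw err; // no previous success to fall back on
      }
    );
  };
}
```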

andymac4182 commented 1 year ago

Is this not going ahead? It seems like this project is complete and there are no plans for this new v1 direction

medikoo commented 1 year ago

@andymac4182 It's still in the plans, but as I otherwise have a full-time job and other things have higher priority, I simply do not find time to handle it.

It's possible that later this year I will have some time for that, but again there's nothing certain :)

andymac4182 commented 1 year ago

That is good to hear :) I totally understand the full time job part. I have the same time constraints :)

Is it worth spinning up a V1 branch and people can contribute to the direction of V1 so you don't need to do it all?

medikoo commented 1 year ago

Is it worth spinning up a V1 branch and people can contribute to the direction of V1 so you don't need to do it all?

@andymac4182 it's a good question. I see v1 as a complete rewrite, so it's hard to just set up an empty branch and tell users to continue. Also, I have some ideas on tackling it, so I was thinking of at least starting this work and then eventually letting others follow up with proper guidance.

andymac4182 commented 1 year ago

No worries :) Happy to help where possible.