medikoo / memoizee

Complete memoize/cache solution for JavaScript
ISC License

Memoize options hash argument by identity #110

Closed. richardscarrott closed this issue 4 years ago

richardscarrott commented 4 years ago

Given a function which accepts an options hash, e.g.

const memoize = require('memoizee');

const fn = (options) => '...';

const memoizedFn = memoize(fn, {
  normalizer: (args) => {
    // I can't JSON.stringify() here as `new Fib()` becomes `{}`; ideally each of the
    // options values would be compared by identity, as if they were passed as
    // positional arguments instead of an options object, i.e. `fn(foo, bar)`.
    return Object.values(args[0]);
  }
});

const fib = new Fib();

memoizedFn({
  foo: 'bar',
  bar: fib
});

memoizedFn({
  foo: 'bar',
  bar: fib
}); // Expect: cache hit

memoizedFn({
  foo: 'bar',
  bar: new Fib()
}); // Expect: cache miss

Is this possible?

medikoo commented 4 years ago

@richardscarrott it is possible, but it's quite a specific case, for which you need to prepare a normalizer manually.

The problem is that you need to identify Fib instances by a primitive value, and if you don't have a straightforward means to do that, you may help yourself with another memoizer, e.g.:

const memoizeWeak = require("memoizee/weak");
let lastFibId = 0;
const getFibId = memoizeWeak(fib => ++lastFibId);
...

const memoizedFn = memoize(fn, {
  normalizer: ([options]) => JSON.stringify({ ...options, bar: getFibId(options.bar) })
});
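
With that in place, repeating the same Fib instance resolves to the same cache id, while a fresh instance produces a new one (a quick sketch of the expected behaviour, reusing the Fib example from your question):

const fib = new Fib();
memoizedFn({ foo: "bar", bar: fib });       // miss
memoizedFn({ foo: "bar", bar: fib });       // hit (getFibId returns the same id)
memoizedFn({ foo: "bar", bar: new Fib() }); // miss (new instance, new id)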
richardscarrott commented 4 years ago

I see, that is quite specific. I wonder if it would be possible to expose a getCacheKey option, allowing call sites to return an array, e.g.

interface Options {
  foo: string;
  bar?: Fib;
}

const fn = (options: Options) => '...';

const memoizedFn = memoize(fn, {
  getCacheKey: (options) => {
    // This is based on an assumption that cache keys can be arrays representing each argument passed in and each index is checked by identity?
    return Object.entries(options).sort(([a], [b]) => {
      if (a < b) return -1;
      if (a > b) return 1;
      return 0;
    }).map(([_, val]) => val);
  }
});

...or could normalizer return an array which would be treated like this?

medikoo commented 4 years ago

@richardscarrott sorry, I don't understand the suggestion.

To be able to cache an invocation we need some primitive (string) identification key that is resolved from the arguments; the normalizer serves exactly this purpose.
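
For example, for arguments that are all plain serializable values, such a key could be produced like this (a minimal sketch, not how memoizee does it internally):

const memoize = require("memoizee");

const memoizedFn = memoize(fn, {
  // Serialize all arguments into a single primitive string key
  normalizer: (args) => JSON.stringify(Array.from(args))
});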

richardscarrott commented 4 years ago

@medikoo If all cache keys are strings, how does the following work:

const memoize = require('memoizee');

const fn = (a, b) => '...';

const memoizedFn = memoize(fn);

const fib = new Fib();
memoizedFn('foo', fib); // miss
memoizedFn('foo', fib); // hit
memoizedFn('foo', new Fib()); // miss (different instance of Fib)

Does the cache not accept an array of values which are each checked by identity?

medikoo commented 4 years ago

how does the following work:

Under the hood, a string id is generated for the given arguments. By default a different instance is treated as a different value, so it works as you describe.

Still, you may provide your own cache id generator (via the normalizer option), where you can recognize different instances as the same value and produce the same cache id accordingly.
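
For example, if two Fib instances carrying the same data should resolve to the same cache entry, the normalizer could serialize that data instead of relying on instance identity (a sketch, assuming a hypothetical `value` field on Fib):

const memoize = require("memoizee");

const memoizedFn = memoize(fn, {
  // Hypothetical: different Fib instances with equal `value` produce the same cache id
  normalizer: ([a, b]) => JSON.stringify([a, b instanceof Fib ? b.value : b])
});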

richardscarrott commented 4 years ago

I see, so it sounds like the default normalizer works in a similar way to your first solution to my options hash issue, where each instance gets a unique id. Thanks for the insight.