optimizely / nuclear-js

Reactive Flux built with ImmutableJS data structures. Framework agnostic.
https://optimizely.github.io/nuclear-js/
MIT License

Combined caching strategy #212

Open scjackson opened 8 years ago

scjackson commented 8 years ago

@jordangarcia @loganlinn

This includes the changes for the cache abstraction Logan authored and the unwatch/config options I added. API changes include:

  1. Config option for cache. This expects a cache constructor that conforms to the interface specified in Logan's earlier work.
  2. Defaults to an LRU cache with a max of 1000 items.
  3. Config option for maxItemsToCache. If not specified, this defaults to 1000. If a number > 0, the LRU cache will cache up to maxItemsToCache values. If non-numeric or null, we fall back to a BasicCache with no item limit.
  4. Config option for useCache that defaults to true. If false, no cache values will be set.
  5. Getter pseudo-constructor that can be used to add cache and cacheKey options for getters. If specified, cacheKey will be used as the getter cache key rather than a getter reference. Valid values for cache are ['always', 'default', 'never']. If 'always' or 'never' is specified, that option is favored over the global useCache setting (see the usage sketch after this list).
  6. When getters are unobserved, their cached values are evicted. When a getter is unobserved for a specific handler, its cached value is evicted only if no other handlers remain associated with that getter.
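
For illustration, here is a hypothetical usage sketch of the options above. The option names mirror the list, but the exact shape of the Getter pseudo-constructor isn't spelled out here, so the signature shown (an array-form getter plus an options object, exposed as Nuclear.Getter) is an assumption.

```js
var Nuclear = require('nuclear-js')

// Options 1-4: reactor-level cache configuration.
var reactor = new Nuclear.Reactor({
  // cache: MyCache,       // optional: a cache constructor conforming to the #211 interface
  maxItemsToCache: 1000,   // LRU limit; non-numeric or null falls back to a BasicCache
  useCache: true,          // global switch; a getter-level 'always'/'never' overrides it
})

// Option 5: per-getter cache settings via the proposed Getter pseudo-constructor
// (assumed here to wrap a standard array-form getter and accept an options object).
var activeItems = Nuclear.Getter(
  [['items'], function(items) {
    return items.filter(function(item) { return item.get('active') })
  }],
  { cache: 'always', cacheKey: 'activeItems' }
)

// Option 6: unobserving evicts cached values when no other handlers remain.
var unobserve = reactor.observe(activeItems, function(value) { /* handler */ })
unobserve() // the cached value for activeItems is evicted if nothing else observes it
```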
loganlinn commented 8 years ago

A goal of the new caching abstraction (#211) was to leave Getter and Reactor as unaware of the caching features as possible. I believe there is a lot of value in keeping Nuclear's API footprint as small as possible, while exposing powerful APIs that can apply to many use cases. There's less code to maintain and fewer concepts for newcomers to learn.

I imagined that these caching options, always/never caching globally or at the Getter level, would be implemented within a Cache instance as opposed to being directly integrated into Reactor. For example, you could create a class that holds a mapping of Getters to their custom cache settings, like

```js
var reactor = new Reactor({
  cache: new GranularCache({
    cache: "always",
    getters: Immutable.Map([
      [someGetter, "never"],  // using [k, v] tuple constructor because key is object
    ]),
  }),
})
```

Inside GranularCache, you would wrap a BasicCache (or even pass it in if you want it to compose), then look up the Getter-specific settings in miss() to act accordingly.
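
To make that concrete, here is a rough sketch of what a GranularCache could look like. It assumes the #211 cache interface is lookup/has/hit/miss/evict/asMap, that cache operations return a new cache rather than mutating in place, and that the BasicCache from that work is available in scope; GranularCache itself and its option names are hypothetical.

```js
var Immutable = require('immutable')

class GranularCache {
  constructor(options, backingCache) {
    options = options || {}
    this.options = options
    this.defaultPolicy = options.cache || 'default'           // 'always' | 'default' | 'never'
    this.getterPolicies = options.getters || Immutable.Map()  // per-Getter overrides
    // Wrap a BasicCache by default, or compose with any #211-compatible cache.
    this.backingCache = backingCache || options.backingCache || new BasicCache()
  }

  lookup(key, notFoundValue) {
    return this.backingCache.lookup(key, notFoundValue)
  }

  has(key) {
    return this.backingCache.has(key)
  }

  asMap() {
    return this.backingCache.asMap()
  }

  hit(key) {
    return this._wrap(this.backingCache.hit(key))
  }

  miss(key, entry) {
    // The per-Getter setting wins over the cache-wide default.
    var policy = this.getterPolicies.get(key, this.defaultPolicy)
    if (policy === 'never') {
      return this // skip storing values for this Getter
    }
    return this._wrap(this.backingCache.miss(key, entry))
  }

  evict(key) {
    return this._wrap(this.backingCache.evict(key))
  }

  // Return a new GranularCache around an updated backing cache, keeping the settings.
  _wrap(newBackingCache) {
    return new GranularCache(this.options, newBackingCache)
  }
}
```

This keeps the policy logic entirely inside the cache, so Reactor and Getter stay unaware of it, which matches the goal stated above.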

I think the main changes here that we should keep are the caching improvements around removing observers.

How does that sound?

scjackson commented 8 years ago

@loganlinn @jordangarcia up to you guys what you want to do with this. It'd be a shame not to get this into prod soon, as it resolves a good chunk of the memory leaks we see on the Optimizely account.