angelnikolov / ts-cacheable

Observable/Promise Cache Decorator
https://npmjs.com/package/ts-cacheable
ISC License
340 stars 42 forks

Feature: Return cache first. Then update cache via original method call. #51

Closed JarekToro closed 4 years ago

JarekToro commented 5 years ago

Like the title says. The cache is used to pre-populate the observable. Then it still runs the original method, caches that data, and emits the new values. maxAge could be used to dictate whether it needs to update in the background or not.

angelnikolov commented 5 years ago

@JarekToro do you envision something like:

@Cacheable({
   data: [1,2,3]
});

Also, what do you mean update in the background or not?

JarekToro commented 5 years ago

No, not exactly like that. Here is the use case. Let's say you have a very slow api call, but it's a call that you have to make at least once per user session. The data this api call returns doesn't often change, but it can.

So the first time you make the api call, it gets the data and caches it. The next time, maybe a few days later, the api is called again. First it returns the cached result. But it still makes the actual api call, and when that returns, we update the cache and push the new data into the observable.

The observable would look something like this

// First call, with no cache:
--api_results-->
// Second call: first returns the cache, then the api results.
--cache--api_results-->
angelnikolov commented 5 years ago

I am not sure how that would work without interfering with the business logic of our users. Normally, the observable completes immediately after we have returned any data to the user, cached or not. For example, when we have no cache and the user calls http.get(), it returns an Observable which completes immediately after emitting. The user would call the method, we would call the endpoint and cache the results. Next, the user would call the method once more, and now we would immediately return the cached data but also emit the real data later. This means the Observable must not complete after the first emission, which would require us to convert it to a Subject (a BehaviorSubject in our case) that supports such a mechanism. However, that means the underlying stream will never complete, and we could cause issues for users who used it like this, for example:

this.userService.getUsers().subscribe();

Normally, you would expect the subscribe callback to run only once; however, if we next on the subject again when the real data arrives, it will be called again. I am not sure this will make sense to new adopters of the library, and it also somewhat inverts the relationship between the application and the library, since the library would essentially be able to decide when to call back into the application. I will think about it and maybe try it out as a configuration option, but I am still not sure about it.

JarekToro commented 5 years ago

I have a prototype working; I will share it at some point (I am on my phone at the moment). It involves merging an of(cacheData) observable with the original observable, so a Subject is not needed. I was also thinking that a maxAge could be used: if the cached value is within a certain time frame, it won't have to make the api call at all, and only if it's past a certain age will it also fetch the api results.

angelnikolov commented 5 years ago

@JarekToro Looking forward to the prototype :)

JarekToro commented 5 years ago

This is based around your lib but currently doesn't integrate with the other features you have implemented, like the cache buster etc.

Basically it creates an Observable from the cached data and merges it with the regular observable returned from the method. It adds an operator via pipe(tap(x => //setCache)) to the end of the original method's observable to update the cache.


import { EMPTY, merge, Observable, of } from 'rxjs';
import { tap } from 'rxjs/operators';
import { ICacheable } from 'ts-cacheable';

export function StartWithCache() {
  return function(
    _target: Object,
    _propertyKey: string,
    propertyDescriptor: TypedPropertyDescriptor<ICacheable<Observable<any>>>
  ) {
    const cacheKeyPrefix = _target.constructor.name + '#' + _propertyKey;
    const oldMethod = propertyDescriptor.value;
    if (propertyDescriptor && propertyDescriptor.value) {
      (propertyDescriptor.value as any) = function(..._parameters: Array<any>) {
        const cacheKey = `${cacheKeyPrefix}${JSON.stringify(_parameters)}`;
        const cache = Cache.getC(cacheKey); // of(cachedValue), or EMPTY when there is no entry
        const oldObs = oldMethod.call(this, ..._parameters) as Observable<any>;
        return merge(
          cache,
          oldObs.pipe(Cache.setC(cacheKey)) // tap() operator that saves the fresh data to localStorage
        );
      };
    }

    return propertyDescriptor;
  };
}

class Cache {
  static getC<T>(id: string): Observable<T> {
    const raw = localStorage.getItem(id);
    return raw !== null ? of(JSON.parse(raw) as T) : EMPTY;
  }

  static setC<T>(id: string) {
    return tap((x: T) => localStorage.setItem(id, JSON.stringify(x)));
  }
}
angelnikolov commented 5 years ago

@JarekToro I am gonna need to experiment with this, because as I said, this could introduce issues for existing users. Imagine that people have functionality which expects that calling a cached method will only produce one result (i.e. the callback passed to subscribe will be called only once). I will think about it, thanks for taking the time to propose this :)

JarekToro commented 5 years ago

No problem. I agree with you on that. I was thinking of an additional option, something like startWithCache: boolean, so as not to interfere with the existing implementation.
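As a config fragment, the proposed opt-in flag might look like this. To be clear, startWithCache is hypothetical and NOT part of the published ts-cacheable API:

```typescript
// Hypothetical config shape only; startWithCache does not exist in the library.
@Cacheable({
  maxAge: 60_000,        // serve the cache alone while younger than this
  startWithCache: true   // otherwise: emit the stale value first, refresh behind it
})
getUsers(): Observable<User[]> {
  return this.http.get<User[]>('/api/users');
}
```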

Thanks for looking into it. I appreciate this lib as it is already so thank you for that!

stale[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

kalpeshhpatel commented 1 year ago

Hi @angelnikolov and @JarekToro we are looking for something similar. Is there any chance we could have this support?

Or is there any way the custom storage strategy can be extended for such functionality?

Thanks for your time :)