isaacs / node-lru-cache

A fast cache that automatically deletes the least recently used items
http://isaacs.github.io/node-lru-cache/
ISC License

Distinguish ttl removed items from manually deleted items #330

Closed Tipwheal closed 4 months ago

Tipwheal commented 8 months ago

Hello, I found that there are dispose and disposeAfter options in the constructor.

In my case, I want to distinguish between two behaviors:

When an item is manually deleted by the user, I want to hand control back to the user to decide when to destroy it. But when an item is evicted because of the max size setting or the ttl setting, I need to destroy it automatically in the dispose or disposeAfter hook. The problem is that in both the TTL-removal case and the manual-deletion case, the reason passed to the hook is delete, so I can't tell the two apart and can't implement this logic.
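
Roughly, this is the situation in my dispose hook (Resource and destroy() are my own code, just for illustration; they are not part of lru-cache):

  import { LRUCache } from 'lru-cache'

  // Illustrative resource type; destroy() is my own cleanup, not an lru-cache API.
  type Resource = { destroy: () => void }

  const cache = new LRUCache<string, Resource>({
    max: 100,
    ttl: 60_000,
    dispose: (value, key, reason) => {
      if (reason === 'delete') {
        // Both a TTL removal and an explicit cache.delete(key) arrive here
        // with reason 'delete', so I can't tell whether to call
        // value.destroy() or leave the value alive for the user to clean up later.
      }
    },
  })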

Is there any solution to this?

isaacs commented 4 months ago

I'll add a few more entries to the DisposeReason option, so that this is clearer.

  /**
   * The reason why an item was removed from the cache, passed
   * to the {@link Disposer} methods.
   *
   * - `evict`: The item was evicted because it is the least recently used,
   *   and the cache is full.
   * - `set`: A new value was set, overwriting the old value being disposed.
   * - `delete`: The item was explicitly deleted, either by calling
   *   {@link LRUCache#delete}, {@link LRUCache#clear}, or
   *   {@link LRUCache#set} with an undefined value.
   * - `expire`: The item was removed due to exceeding its TTL.
   * - `fetch`: A {@link OptionsBase#fetchMethod} operation returned
   *   `undefined` or was aborted, causing the item to be deleted.
   */
  export type DisposeReason = 'evict' | 'set' | 'delete' | 'expire' | 'fetch'
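
With these reasons, a dispose hook can tear down values only on cache-initiated removals and leave explicit deletes to the caller. A minimal sketch (destroy() here is just a placeholder for the user's own cleanup, not an lru-cache API):

  import { LRUCache } from 'lru-cache'

  type Resource = { destroy: () => void }

  const cache = new LRUCache<string, Resource>({
    max: 100,
    ttl: 60_000,
    dispose: (value, key, reason) => {
      // 'evict' and 'expire' are cache-initiated removals: clean up automatically.
      // 'delete' means the user removed the entry explicitly, so ownership of the
      // value (and its destruction) stays with the caller.
      if (reason === 'evict' || reason === 'expire') {
        value.destroy()
      }
    },
  })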

I'll publish the new version shortly.

You could probably also use the status object, though it'd be convoluted and flaky. You'd need to put a status object on basically every fetch(), get(), and set() call, and then the dispose method would have to work out which operation caused the removal, which is going to be painful if you have lots of async fetch() calls with different latencies all happening in parallel.