Open abritinthebay opened 8 years ago
+1 It would be nice and possibly useful also for non-deterministic functions.
You can just do this in your own cache handler, no need to extend fast-memoize for this.
By combining it with two other packages you can set both the maximum number of items in the cache and a TTL, without changing anything in fast-memoize.
import memoize from 'fast-memoize'
import Keyv from 'keyv'
import Cache from 'map-expire/MapExpire'
// We'll create a custom cache adapter
const storageAdapter = new Keyv({
ttl: 1000, // expires the entry after 1s
store: new Cache([], {
capacity: 1000,
// duration: 1000, // in milliseconds, default expiration time. No need to set it if ttl is already set
}),
})
// wrap the storage in a type of cache fast-memoize expects
const cache = {
create: function create() {
return storageAdapter
},
}
// from here it works as usual
const fn = function (one, two, three) { /* ... */ }
const memoized = memoize(fn, {
cache,
})
Update: it seems I was wrong and the above works only with my particular setup. I have a local fork of the package that I converted to async/await to match my needs. Now that I've cloned the upstream package and run its tests, I see they fail for my setup unless I make the code async/await and await the memoized functions in my code.
So it's better to dismiss my previous suggestion. If I find the time I may investigate more deeply how to make the tests pass. In the meantime, if anyone is curious, these are the lines I changed in the package:
async function monadic (fn, cache, serializer, arg) { // here
var cacheKey = isPrimitive(arg) ? arg : serializer(arg)
var computedValue = await cache.get(cacheKey) // here
if (typeof computedValue === 'undefined') {
computedValue = await fn.call(this, arg) // here
cache.set(cacheKey, computedValue)
}
return computedValue
}
async function variadic (fn, cache, serializer) { // here
var args = Array.prototype.slice.call(arguments, 3)
var cacheKey = serializer(args)
var computedValue = await cache.get(cacheKey) // here
if (typeof computedValue === 'undefined') {
computedValue = await fn.apply(this, args) // here
cache.set(cacheKey, computedValue)
}
return computedValue
}
This is how I call my memoized functions:
const fn = function (one, two, three) { /* ... */ }
const memoized = memoize(fn, {
cache,
})
const asyncFunc = async () => {
const result = await memoized(one, two, three)
console.log({ result })
return result
}
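For anyone who wants to try the pattern without the fork, here is a minimal standalone sketch of the same async monadic path, assuming a plain Map as the store. `memoizeAsync` is a hypothetical name used for illustration only; it is not fast-memoize's API.

```javascript
// Sketch of an async-aware monadic memoize over a Map-backed store.
// Awaiting cache.get works for both sync and async stores, since
// awaiting a non-promise just returns the value.
function memoizeAsync (fn, cache = new Map()) {
  return async function (arg) {
    const cacheKey = (arg !== null && typeof arg === 'object') ? JSON.stringify(arg) : arg
    let computedValue = await cache.get(cacheKey)
    if (typeof computedValue === 'undefined') {
      computedValue = await fn.call(this, arg)
      cache.set(cacheKey, computedValue)
    }
    return computedValue
  }
}

// usage: the underlying function runs once; later calls hit the cache
let calls = 0
const double = async (n) => { calls += 1; return n * 2 }
const memoizedDouble = memoizeAsync(double)
```

The key point is the same as in the changed lines above: both the cache lookup and the wrapped function call are awaited, so the wrapper works whether the store or the function is synchronous or asynchronous.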
Update 2: Here is a working fork that handles both standard and async caches as well as async functions: https://github.com/ecerroni/fast-memoize.js
@ecerroni could you explain how to use TTL in your fork? I looked at the code and don't see any handling of TTL with options ... Thanks
@leefsmp
yarn add ecerroni/fast-memoize.js
// import
const memoize = require('fast-memoize')
const Keyv = require('keyv')
const Cache = require('map-expire/MapExpire')
// create a custom cache adapter
const storageAdapter = new Keyv({
ttl: 1000, // in milliseconds, expires the entry after 1s
store: new Cache([], {
capacity: 1000,
// duration: 1000, // in milliseconds, default expiration time. No need to set it if ttl is already set
}),
})
// wrap the storage in a type of cache fast-memoize expects
const cache = {
create: function create() {
return storageAdapter
},
}
;(async () => {
// function
let myFn = () => { ... }
myFn = memoize(myFn, { cache })
const memoizedFn = myFn
await memoizedFn() // the cache entry will expire after 1 second
})()
That way cache entries will automatically expire and not explode memory usage. This is especially important on the server.
This would likely involve you keeping a separate "shadow cache" that kept the created times of the cacheKeys.
Obviously this would be fractionally slower, so a simple opt-in to this code path (say,
{TTL: [any non-zero value in ms]}
in the options) is probably all you need. If you're interested I can submit a PR ;)
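The "shadow cache" idea above can be sketched as a store that keeps a second map of creation timestamps next to the values and lazily evicts expired entries on access. All names here are illustrative; this is not fast-memoize's actual API, just one way the opt-in TTL path could work.

```javascript
// Sketch of the shadow-cache TTL idea: `createdAt` is the shadow map of
// creation times; entries older than `ttl` (ms) are dropped when touched.
function createTTLCache (ttl) {
  const values = new Map()
  const createdAt = new Map() // the "shadow cache" of created times

  return {
    has (key) {
      if (!values.has(key)) return false
      if (Date.now() - createdAt.get(key) > ttl) {
        values.delete(key)      // entry expired: evict lazily on access
        createdAt.delete(key)
        return false
      }
      return true
    },
    get (key) {
      return this.has(key) ? values.get(key) : undefined
    },
    set (key, value) {
      values.set(key, value)
      createdAt.set(key, Date.now())
    }
  }
}
```

Lazy eviction keeps the fast path cheap (one extra map lookup and a subtraction per hit), which is why gating it behind a non-zero TTL option would keep the default path unchanged.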