No, this class does not enforce a singleton; that would be a bad idea, since you might need to cache multiple different sorts of things!
But modules in node are singletons.
So, for example, this is probably what you want to do:
```ts
// in src/routes/api.ts or something
import LRUCache from 'lru-cache'

// only create one fooCache for this _module_, which is a singleton
const fooCache = new LRUCache<string, Foo>({
  max: 10000,
  ttl: 1000 * 60 * 60, // 1hr
  fetchMethod: async (key) => await getNewFoo(key),
})

export const getFoo = async (params) => {
  try {
    const foo = await fooCache.fetch(params.foo)
    return new Response(JSON.stringify(foo))
  } catch (er) {
    console.error('foo error', params, er)
    return new Response(JSON.stringify({ message: 'error fetching foo' }), {
      status: 500,
    })
  }
}
```
Then if you had another file that did something like this:
```ts
// src/routes/other.ts
import { getFoo } from './api'
import { otherThing } from './other-thing'

export const otherEndpoint = async (params) => {
  if (params.foo) {
    return getFoo(params)
  } else {
    return otherThing(params)
  }
}
```
then it'll still always use the module-local fooCache.
However, this would almost certainly not be what you want:
```ts
// in src/routes/api.ts or something
import LRUCache from 'lru-cache'

export const getFoo = async (params) => {
  // oh no! We're not actually caching anything, because
  // we throw the cache away and create a new one on
  // each API request! That's no good!
  const fooCache = new LRUCache<string, Foo>({
    max: 10000,
    ttl: 1000 * 60 * 60, // 1hr
    fetchMethod: async (key) => await getNewFoo(key),
  })
  try {
    const foo = await fooCache.fetch(params.foo)
    return new Response(JSON.stringify(foo))
  } catch (er) {
    console.error('foo error', params, er)
    return new Response(JSON.stringify({ message: 'error fetching foo' }), {
      status: 500,
    })
  }
}
```
Hope that helps.
Note that in some platforms' dev environments, the module cache is blown away on every code change, or even on every request, so you won't see any caching happening. One workaround for that is to put the cache on the global object, and only recreate it if it's missing, so that you don't get it blown away on each update.
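For example, a minimal sketch of that workaround (reusing the hypothetical `Foo` and `getNewFoo` placeholders from the examples above):

```ts
// src/foo-cache.ts (hypothetical filename)
import LRUCache from 'lru-cache'

// stash the cache on globalThis, so a hot reload that re-evaluates
// this module picks up the existing instance instead of a fresh one
const g = globalThis as typeof globalThis & {
  fooCache?: LRUCache<string, Foo>
}

export const fooCache =
  g.fooCache ??
  (g.fooCache = new LRUCache<string, Foo>({
    max: 10000,
    ttl: 1000 * 60 * 60, // 1hr
    fetchMethod: async (key) => await getNewFoo(key),
  }))
```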
Thanks for the examples, I'll take a look when I get back to my desk. Basically, the codebase we are working with is a server that processes a number of different API calls, and each API call may go through many files that process the info.
What we are seeing is that there are endpoints running through different files (modules) that create a new instance of LRUCache using the `new` keyword, and then use its setters and getters.
So imagine, if you will, that someone sends an API request, it reaches our server (running express), and each time an API call is made, somewhere in that path of function calls a new instance of LRUCache is created and used.
Does that mean it is incorrect? Since after the code runs, the server instance and connection end, on the next API call the cache would be empty because it just creates a new instance?
@isaacs
Sorry I missed this comment.
Yes, you definitely need to be creating the cache in some long-lived fashion (i.e., NOT in a request handler function or transient serverless handler), or else it's not making anything faster, but rather quite the opposite.
It's probably fine to create the cache lazily, so the first request is slower but repeated requests are faster, but the cache does need to live long enough to actually have cache hits, or it's just pure cost with no benefit.
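For example, a sketch of that lazy approach (again reusing the hypothetical `Foo` and `getNewFoo` placeholders):

```ts
import LRUCache from 'lru-cache'

// created lazily: nothing is built until the first request needs it,
// but once built, the instance lives at module scope and is reused
let fooCache: LRUCache<string, Foo> | undefined

const getFooCache = () => {
  if (!fooCache) {
    fooCache = new LRUCache<string, Foo>({
      max: 10000,
      ttl: 1000 * 60 * 60, // 1hr
      fetchMethod: async (key) => await getNewFoo(key),
    })
  }
  return fooCache
}

export const getFoo = async (key: string) => getFooCache().fetch(key)
```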
So what I mean is: if I create an instance of LRUCache in a file, and that file gets called multiple times across multiple API requests, is that instance re-created (basically re-creating the initial cache), or will it remember it was created in a prior API call and use the data from there?