Open · revskill10 opened this issue 1 year ago
Hi, I agree that some sort of LRU caching would be a nice addition.
I'm not sure I understand what you mean about setting the cache directory size.
Are you suggesting using the lru-cache NPM module or something else?
@mistval I'm using edge functions on Vercel and Cloudflare Workers, so the goal is to use their Cache API to implement a basic LRU cache on top of the Web API fetch. Do you have any ideas on this? Sorry, I don't have much experience with edge/Web API computing yet.
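For reference, this is roughly the pattern I mean on the Cloudflare Workers side (an untested sketch; `caches.default` and `ctx.waitUntil` come from the Workers runtime, and as far as I know the platform manages eviction itself rather than exposing an LRU size limit):

```js
// Minimal sketch of a Cloudflare Worker that serves from the Cache API when
// possible, and otherwise fetches from the origin and caches the response.
export default {
  async fetch(request, env, ctx) {
    const cache = caches.default;

    let response = await cache.match(request);
    if (!response) {
      response = await fetch(request);
      // A Response body can only be read once, so cache a clone.
      ctx.waitUntil(cache.put(request, response.clone()));
    }
    return response;
  },
};
```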
As for the cache directory size, the idea is to automatically purge the LRU cache once the cached data grows larger than a threshold. For example, on Vercel the default size for /tmp is 500 MB; in this case I want a 5 MB limit instead.
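Behaviour-wise, something like the `maxSize`/`sizeCalculation` options of the lru-cache module you mentioned is what I have in mind (just a sketch, assuming a recent lru-cache version with the named `LRUCache` export; the 5 MB figure is only the example above):

```js
import { LRUCache } from 'lru-cache';

// Evict least-recently-used entries once the total cached size exceeds 5 MB.
const cache = new LRUCache({
  maxSize: 5 * 1024 * 1024,
  // Per-entry size in bytes; values here are cached response bodies (strings/Buffers).
  sizeCalculation: (value) => Buffer.byteLength(value),
});

cache.set('https://example.com/data.json', '{"hello":"world"}');
console.log(cache.get('https://example.com/data.json')); // a hit until it is evicted
```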
Hmm, I am not familiar with those technologies. The cache module used here (cacache) also doesn't seem to have an LRU ejection mechanism built in. It's still a good suggestion, but may be tricky.
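One possible workaround might be to trim the cache directory manually with cacache's `ls`, `rm.entry`, and `verify` APIs, evicting the oldest entries first. An untested sketch (the helper name, cache path, and 5 MB budget below are just placeholders):

```js
import cacache from 'cacache';

// Rough idea: shrink the cache directory to a byte budget by removing the
// oldest entries first. This is only approximate LRU, since cacache records
// the time an entry was written rather than when it was last read.
async function trimCache(cachePath, maxBytes) {
  const entries = Object.values(await cacache.ls(cachePath))
    .sort((a, b) => a.time - b.time); // oldest first

  let total = entries.reduce((sum, entry) => sum + entry.size, 0);
  for (const entry of entries) {
    if (total <= maxBytes) break;
    await cacache.rm.entry(cachePath, entry.key);
    total -= entry.size;
  }

  // Garbage-collect content that is no longer referenced by the index.
  await cacache.verify(cachePath);
}

// e.g. keep the cache under 5 MB
await trimCache('/tmp/my-fetch-cache', 5 * 1024 * 1024);
```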
Could you please add options for an LRU cache with a maximum size threshold? Normally I set the cache directory with a maximum size. Auto eviction is built in with LRU too, I think.