isaacs / node-lru-cache

A fast cache that automatically deletes the least recently used items
http://isaacs.github.io/node-lru-cache/
ISC License

Explore writing some/most of this lib in C, and compiling to WASM #205

Closed: isaacs closed this issue 1 year ago

isaacs commented 2 years ago

Not sure if that'd be faster, but it's worth exploring.

eugene1g commented 2 years ago

IIRC, there is no reliable way to share JS objects with a WASM module's memory, so to avoid having to clone every object in the cache you'd need to change the architecture: keep the actual objects in JS, and track only a pointer (index) to each of them in WASM for the TTL/size/eviction calculations. If true, I'd expect this translation between JS and WASM to have a significant negative impact on performance.

isaacs commented 2 years ago

Probably the thing to explore is keeping just the next/prev lists and head/tail indexes in the wasm memory block, and using a C implementation for just the linked-list management. The keyMap, valList, and keyList need to stay in js land. The ttls, starts, and sizes could live in wasm memory as well, since they're also just integers, but we don't limit size on those, so they might be safer in js.
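
A rough sketch of how that split might look from the JS side, assuming a hypothetical `list.wasm` compiled from C that exports `lruInit` / `lruMarkUsed` / `lruEvictOldest` (none of these names exist in this repo, and this is not how lru-cache is implemented today). The point is the boundary: only integer indexes cross into WASM, while `keyMap`, `keyList`, and `valList` stay as plain JS structures. Assumes an ES module with top-level await.

```ts
// Sketch only: the index-based doubly linked list (next/prev arrays plus
// head/tail) would live inside the WASM module's linear memory and be
// manipulated by C code, while keys and values never leave JS.
// "list.wasm" and the exported function names are hypothetical.
import { readFileSync } from 'node:fs'

interface ListExports {
  memory: WebAssembly.Memory          // linear memory holding next/prev/head/tail
  lruInit(max: number): void          // set up the list for `max` slots
  lruMarkUsed(index: number): void    // link or move `index` to the MRU end
  lruEvictOldest(): number            // unlink the LRU slot and return its index
}

const { instance } = await WebAssembly.instantiate(readFileSync('./list.wasm'))
const list = instance.exports as unknown as ListExports

class WasmBackedLRU<K, V> {
  // These stay in js land, as described above.
  private keyMap = new Map<K, number>()
  private keyList: (K | undefined)[]
  private valList: (V | undefined)[]
  private size = 0

  constructor(private max: number) {
    this.keyList = new Array(max)
    this.valList = new Array(max)
    list.lruInit(max)
  }

  get(k: K): V | undefined {
    const i = this.keyMap.get(k)
    if (i === undefined) return undefined
    list.lruMarkUsed(i)               // only an integer crosses the boundary
    return this.valList[i]
  }

  set(k: K, v: V): void {
    let i = this.keyMap.get(k)
    if (i === undefined) {
      i = this.size < this.max ? this.size++ : this.evictOldest()
      this.keyList[i] = k
      this.keyMap.set(k, i)
    }
    this.valList[i] = v
    list.lruMarkUsed(i)
  }

  private evictOldest(): number {
    const i = list.lruEvictOldest()   // the C side decides which slot to reuse
    this.keyMap.delete(this.keyList[i] as K)
    this.keyList[i] = undefined
    this.valList[i] = undefined
    return i
  }
}
```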

isaacs commented 1 year ago

Going to close this. It's a neat idea, but WASM is kind of a pain, and a year after posting this issue I still haven't gotten around to digging into it. If I ever do, this library will probably be the first one I try to tackle. For now, it's fast enough lol

isaacs commented 1 year ago

Plus, having to await a promise on startup just to instantiate the WASM module would be kind of annoying.
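
For context, that startup hit comes from `WebAssembly.instantiate` being promise-based; a minimal sketch of what loading the hypothetical compiled module would look like:

```ts
// Loading the hypothetical list.wasm at startup: compile/instantiate returns
// a promise, so anything built on it has to await before the cache is usable.
import { readFileSync } from 'node:fs'

const bytes = readFileSync('./list.wasm')
const { instance } = await WebAssembly.instantiate(bytes)

// The synchronous constructors exist too (and work in Node), but they block,
// and browsers restrict synchronous compilation of larger modules:
// const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes))
```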