oven-sh / bun

Incredibly fast JavaScript runtime, bundler, test runner, and package manager – all in one
https://bun.sh

Bun is slower than Node.js for cache eviction in several lru packages by ~1800-3600% #14063

Open bompus opened 3 hours ago

bompus commented 3 hours ago

What version of Bun is running?

1.1.29+6d43b3662

What platform is your computer?

Ubuntu 24.04.1 LTS Linux 6.10.7-x64v3-xanmod1 x86_64 x86_64

What steps can reproduce the bug?

$ bun add mitata flru lru.min lru-cache mnemonist
$ bun lru-cache-mitata.js && node lru-cache-mitata.js

// lru-cache-mitata.js
import { bench, run, compact, summary } from 'mitata';

import createFlru from 'flru';
import { createLRU as createLruMin } from 'lru.min';
import { LRUCache } from 'lru-cache';
import createMnemonistLruCache from 'mnemonist/lru-cache.js'; // .js extension required for Node.js?

const maxItems = 1000;
const evictItems = maxItems * 2.1;

const DATA = new Array(evictItems).fill(0).map((_, index) => {
  return [`key${index}`, { hello: Math.floor(Math.random() * 1e7) }];
});

const DATA2 = new Array(evictItems).fill(0).map((_, index) => {
  return [`key${index}`, { world: Math.floor(Math.random() * 1e7) }];
});

const flru = createFlru(maxItems);
const lruMin = createLruMin({ max: maxItems });
const lruCache = new LRUCache({ max: maxItems });
const mnemonistLru = new createMnemonistLruCache(maxItems);

// prime the cache with the maximum number of items it can hold
for (let i = 0; i < maxItems; i++) {
  flru.set(DATA[i][0], DATA[i][1]);
  lruMin.set(DATA[i][0], DATA[i][1]);
  lruCache.set(DATA[i][0], DATA[i][1]);
  mnemonistLru.set(DATA[i][0], DATA[i][1]);
}

// evict - sets ~2.1x as many entries as `maxItems`, churning the entire cache roughly twice

compact(() => {
  summary(() => {
    bench('flru evict', () => {
      for (let i = 0; i < evictItems; i++) {
        flru.set(DATA2[i][0], DATA2[i][1]);
      }
    });

    bench('lru.min evict', () => {
      for (let i = 0; i < evictItems; i++) {
        lruMin.set(DATA2[i][0], DATA2[i][1]);
      }
    });

    bench('lru-cache evict', () => {
      for (let i = 0; i < evictItems; i++) {
        lruCache.set(DATA2[i][0], DATA2[i][1]);
      }
    });

    bench('mnemonist/lru-cache evict', () => {
      for (let i = 0; i < evictItems; i++) {
        mnemonistLru.set(DATA2[i][0], DATA2[i][1]);
      }
    });
  });
});

await run();

What is the expected behavior?

I would expect Bun to be comparable to or faster than Node.js.

This could be an underlying JSC issue, but either way, it's a significant difference.
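
If this is an engine-level (JSC vs. V8) difference rather than something specific to these packages, a bare Map churned in the same shape should show a similar gap. The sketch below is a hypothetical isolation test, not part of the original measurements; it only exercises keyed insert/delete churn, which is roughly the work these caches do on eviction.

// map-churn-sketch.js - hypothetical isolation test, not from the original report
const maxItems = 1000;
const evictItems = Math.floor(maxItems * 2.1);

// prime a plain Map to capacity
const map = new Map();
for (let i = 0; i < maxItems; i++) {
  map.set(`key${i}`, { hello: i });
}

// churn: drop the oldest key (Maps iterate in insertion order) and insert a new one
const start = performance.now();
for (let i = 0; i < evictItems; i++) {
  const oldest = map.keys().next().value;
  map.delete(oldest);
  map.set(`new${i}`, { world: i });
}
console.log(`map churn: ${(performance.now() - start).toFixed(3)} ms`);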

flru is much simpler, and the difference there is only ~25%, but the other packages are roughly 1800-3600% slower in Bun for cache eviction, depending on which benchmarking library is used. I initially tested with tinybench, then confirmed with mitata.
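
For reference, a tinybench version of the same eviction benchmark can be sketched as follows (illustrative only, not the exact script that was run; shown for a single cache and assuming tinybench is installed):

// lru-cache-tinybench.js - illustrative sketch, not the exact tinybench script used
import { Bench } from 'tinybench';
import { LRUCache } from 'lru-cache';

const maxItems = 1000;
const evictItems = Math.floor(maxItems * 2.1);

const DATA2 = new Array(evictItems).fill(0).map((_, index) => {
  return [`key${index}`, { world: Math.floor(Math.random() * 1e7) }];
});

// prime the cache to capacity before measuring eviction
const lruCache = new LRUCache({ max: maxItems });
for (let i = 0; i < maxItems; i++) {
  lruCache.set(`key${i}`, { hello: i });
}

const bench = new Bench({ time: 500 });

bench.add('lru-cache evict', () => {
  for (let i = 0; i < evictItems; i++) {
    lruCache.set(DATA2[i][0], DATA2[i][1]);
  }
});

await bench.run();
console.table(bench.table());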

What do you see instead?

$ bun --revision # 1.1.29+6d43b3662
$ bun lru-cache-mitata.js

[screenshot: mitata benchmark results under Bun]

$ node -v # v22.9.0
$ node lru-cache-mitata.js

[screenshot: mitata benchmark results under Node.js]

Additional information

No response

RADHAsakthivel commented 3 hours ago

import createMnemonistLruCache from 'mnemonist/lru-cache.js'; — Is createMnemonistLruCache a normal LRU cache implementation? If yes, can you please share the implementation of mnemonist/lru-cache.js?

bompus commented 3 hours ago

@RADHAsakthivel docs for mnemonist/lru-cache. I just had to add the .js extension to prevent Node.js from throwing an error saying it can't find the import.
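
For reference, a minimal usage sketch of mnemonist's LRUCache (illustrative only - see the linked docs for the full API). It is a standard fixed-capacity LRU: once the cache is full, each additional set() evicts the least-recently-used entry.

// mnemonist-lru-sketch.js - illustrative usage, not from the original report
import LRUCache from 'mnemonist/lru-cache.js';

const cache = new LRUCache(3); // capacity of 3 entries

cache.set('a', 1);
cache.set('b', 2);
cache.set('c', 3);

cache.get('a');    // reading 'a' marks it most-recently-used
cache.set('d', 4); // cache is full, so this evicts 'b', the least-recently-used entry

console.log(cache.has('b'));             // false
console.log(cache.size, cache.capacity); // 3 3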