Closed tzapzoor closed 2 years ago
Could you write a simple (JS not TS) script to reproduce?
const level = require('level-rocksdb');

let db = level('./database', { valueEncoding: 'json' });

const getKeys = () => {
  let keys = [];
  for (let i = 0; i < 2000; i++) {
    keys.push('a');
  }
  return keys;
};

(async () => {
  await db.open();
  while (true) {
    let keys = getKeys();
    let values = await db.getMany(keys);
    console.log(values.length);
  }
})();
This can be run on an empty database as well. It fills 5 GB in 2 minutes and keeps growing.
Thanks, looking into it. Also reproduces on other implementations (that share code):
const { ClassicLevel } = require('classic-level')

const db = new ClassicLevel('./db/' + Date.now())
const keys = new Array(2000).fill(0).map((_, i) => String(i))

;(async () => {
  while (true) {
    await db.getMany(keys)
    console.log(process.memoryUsage().rss / 1024 / 1024)
  }
})()
Fixed in 5.2.1. Thanks for the report - and your Open Collective donation ❤️
Context:
OS: macOS 12.2.1
Chip: Apple M1 Pro
Node: v16.13.2
rocksdb: 5.2.0
level-rocksdb: 5.0.0

Db options:
I stress tested the db.getMany method with batches of 500 keys as part of a simple Node.js Express app, and after some time the memory usage of the process was about 40 GB and growing. I then replaced getMany with get and the memory stabilised at about 300 MB, with worse performance, of course. The problem reproduces 100%. Do you have ideas why this could happen?

cc: @vweevers, pinging as I saw you added support for getMany. Thank you!
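The interim workaround described above (swapping getMany for individual get calls) can be sketched roughly like this. The Map-backed db stub and the getManyFallback helper are hypothetical stand-ins so the snippet runs standalone; with a real level-rocksdb instance, db.get is the actual method, and like getMany this maps missing keys to undefined:

```javascript
// Stub db backed by a Map, standing in for a level-rocksdb instance
// (assumption: db.get(key) returns a Promise and rejects on a missing key).
const store = new Map([['a', 1], ['b', 2]]);
const db = {
  get: async (key) => {
    if (!store.has(key)) throw new Error('NotFound: ' + key);
    return store.get(key);
  },
};

// getMany replacement: fetch each key with db.get() in parallel,
// mapping missing keys to undefined as getMany does.
const getManyFallback = (db, keys) =>
  Promise.all(keys.map((key) => db.get(key).catch(() => undefined)));

(async () => {
  const values = await getManyFallback(db, ['a', 'b', 'missing']);
  console.log(JSON.stringify(values)); // prints [1,2,null]
})();
```

This issues one get per key, so it trades throughput for the stable memory behaviour reported above; batching the keys into smaller chunks would be a middle ground.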