redis / ioredis

🚀 A robust, performance-focused, and full-featured Redis client for Node.js.
MIT License

Help with massive inserts #588

Closed pr1ntr closed 6 years ago

pr1ntr commented 6 years ago

Hello, I have a data set that can be anywhere from 10k to 500k records. This is the code I am using to insert it:

```js
const multi = this.redisClient.pipeline()

Object.keys(tiers).forEach((tierId) => {
  const tierList = tiers[tierId]
  tierList.forEach((score, index) => {
    multi.zadd(this.nsRank(tierId), index + 1, score.conKey)
    multi.hset(this.nsConstituents(), score.conKey, JSON.stringify(scrubScoreObject(score)))
  })
})
console.log('does it get past this')
await multi.exec()
```

tierList can be pretty large.
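Since every queued command stays in memory until `exec()` is called, one common way to keep the process from buffering hundreds of thousands of commands at once is to flush the pipeline in fixed-size batches. The sketch below illustrates that idea; the batch size is an arbitrary assumption, and `fakePipeline` is a stub standing in for `this.redisClient.pipeline()` so the sketch runs without a Redis server.

```js
// Sketch: flush in batches instead of queueing all commands in one pipeline.
// BATCH_SIZE is a hypothetical value; tune it for your payload sizes.
const BATCH_SIZE = 10000

// Stand-in for an ioredis pipeline, so this runs without a server.
// It records how many commands each exec() flushed into execLog.
function fakePipeline (execLog) {
  const commands = []
  return {
    zadd: (...args) => commands.push(['zadd', ...args]),
    exec: async () => { execLog.push(commands.length) }
  }
}

// Queue entries of the form [key, score, member], flushing every BATCH_SIZE
// commands so the buffered command list never grows unbounded.
async function insertInBatches (entries, makePipeline) {
  let pipeline = makePipeline()
  let queued = 0
  for (const [key, score, member] of entries) {
    pipeline.zadd(key, score, member)
    queued += 1
    if (queued >= BATCH_SIZE) {
      await pipeline.exec() // flush and release the buffered commands
      pipeline = makePipeline() // start a fresh, empty pipeline
      queued = 0
    }
  }
  if (queued > 0) await pipeline.exec() // final partial batch
}
```

With a real client, `makePipeline` would be `() => this.redisClient.pipeline()`; the trade-off is a few extra round trips in exchange for bounded memory use while queueing.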

Currently, on my local machine with a local Redis instance, it throws `FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory` at around 200k records.

I was using node-redis before, and it seemed quite a bit faster for the same data.

With ioredis, this completes in about 20 seconds when using clustered ElastiCache.

The funny thing is that it doesn't even reach `await multi.exec()`. Is this a problem with the loop?

pr1ntr commented 6 years ago

I made a mistake... it seems Windows doesn't give Node much memory by default.