Hello,
I have a data set that can be anywhere from 10k to 500k records. This is the code I am using to load it into Redis:
```js
const multi = this.redisClient.pipeline()
Object.keys(tiers).forEach((tierId) => {
  const tierList = tiers[tierId]
  tierList.forEach((score, index) => {
    multi.zadd(this.nsRank(tierId), index + 1, score.conKey)
    multi.hset(this.nsConstituents(), score.conKey, JSON.stringify(scrubScoreObject(score)))
  })
})
console.log('does it get past this')
await multi.exec()
```
`tierList` can be pretty large. Currently, on my local machine with a local Redis, it throws

```
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
```

at about 200k records. I was using node-redis before, and it seemed quite a bit faster for the same data; with ioredis this completes in about 20 seconds against clustered ElastiCache. The strange thing is that execution never even reaches `await multi.exec()`. Is this a problem with the loop?
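For reference, one workaround I have been considering is splitting the commands into smaller pipelines instead of queueing all of them in one, so the heap never holds 500k+ buffered commands at once. This is only a sketch: `chunk`, `loadTiers`, and `batchSize` are hypothetical names, and the helpers (`nsRank`, `nsConstituents`, `scrubScoreObject`) are passed in as parameters rather than taken from `this`. It relies on ioredis accepting an array of commands in `pipeline()`:

```javascript
// Hypothetical helper: split a flat array into fixed-size batches so each
// pipeline stays small instead of buffering every command in memory at once.
function chunk(items, size) {
  const batches = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// Sketch of the same load, batched. `redisClient` is an ioredis client;
// the three helpers come from the original snippet and are injected here
// only to keep this example self-contained.
async function loadTiers(redisClient, tiers, { nsRank, nsConstituents, scrubScoreObject }, batchSize = 10000) {
  const commands = []
  for (const [tierId, tierList] of Object.entries(tiers)) {
    tierList.forEach((score, index) => {
      commands.push(['zadd', nsRank(tierId), index + 1, score.conKey])
      commands.push(['hset', nsConstituents(), score.conKey, JSON.stringify(scrubScoreObject(score))])
    })
  }
  // Execute one modest pipeline at a time instead of a single giant one.
  for (const batch of chunk(commands, batchSize)) {
    await redisClient.pipeline(batch).exec()
  }
}
```

The `batchSize` of 10000 is just a guess; the right value would need measuring.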