redis / ioredis

🚀 A robust, performance-focused, and full-featured Redis client for Node.js.
MIT License

Seeing a growing time delay when querying relatively large strings from redis. #1840

Open abriejenny opened 10 months ago

abriejenny commented 10 months ago

I am getting a value from Redis that is around 100 kB a significant number of times (due to usage, in production). What I'm seeing is that every subsequent get takes slightly longer than the previous one.

I am trying to understand what the cause of this is, so that I can make a decision on how to tackle the problem.

I have a script at the bottom that replicates what I'm seeing. Results for getting the value 200 times look as follows:

0: time: 43ms
1: time: 45ms
2: time: 50ms
...
199: time: 234ms

You can see it grow 6x in just 200 gets. I even find the first latency surprisingly long, 43ms: I am running the script on the Redis host, so there is no network latency.

Script (need to update redis connection details of course):

import Redis from "ioredis";

const con = new Redis({ // new Redis() returns a client, not a promise, so no await
  port: 6379,
  host: "somehost",
  family: 4,
  password: "somepassword",
  db: 5,
});

con.on("ready", function () {
  console.log("REDIS is ready for action");
});

// sleep for 1 sec to be sure redis is ready for action
await new Promise((resolve) => setTimeout(resolve, 1000));

// store a 100k value in redis
const key = "some:test:key";
const value = "a".repeat(100000);
await con.set(key, value);

// get the value asynchronously 200 times, log the time it took
for (let i = 0; i < 200; i++) {
  let t1 = Date.now();
  con.get(key).then((value) => {
    let t2 = Date.now();
    console.log(`${i}: time: ${t2 - t1}ms`);
  });
}
ag-TJNII commented 8 months ago

I believe you have a flaw in your test.

// get the value asynchronously 200 times, log the time it took
for (let i = 0; i < 200; i++) {
  let t1 = Date.now();
  con.get(key).then((value) => {
    let t2 = Date.now();
    console.log(`${i}: time: ${t2 - t1}ms`);
  });
}

If I'm understanding this properly, each iteration of the loop sets t1 and creates a promise for con.get(key). When each promise resolves it takes t2 and compares it to t1. However, t1 isn't the time the get actually starts; it's the time the promise was created, back in the loop. So the output reflects the time the get itself took, plus however long the request sat queued behind all the earlier gets on the connection.

To clarify further, the numbers I'm seeing are almost the same as this test code:

  // get the value asynchronously 200 times, log the time it took
  let t1 = Date.now();
  for (let i = 0; i < 200; i++) {
    con.get(key).then((value) => {
      let t2 = Date.now();
      console.log(`${i}: time: ${t2 - t1}ms`);
    });
  }
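The arithmetic behind those growing numbers can be shown without a Redis server at all. The sketch below is a simulation, not ioredis code: each simulated get takes a fixed 5 units of service time on a single serialized connection, and every t1 is captured at enqueue time (the tight loop runs effectively instantly), so the i-th measured "latency" includes the service time of all the requests queued ahead of it:

```javascript
// Simulation: one serialized connection, fixed service time per request.
const serviceTime = 5;           // pretend each GET takes 5ms on the wire
const measured = [];
let busyUntil = 0;               // logical clock for the single connection
for (let i = 0; i < 200; i++) {
  const t1 = 0;                  // enqueue time: the tight loop runs "instantly"
  busyUntil += serviceTime;      // requests are serviced strictly in order
  measured.push(busyUntil - t1); // what the .then() callback would report
}
console.log(measured[0], measured[199]); // 5 1000 -- linear growth, as observed
```

The last "latency" is 200x the first even though every request took exactly the same service time, which matches the linear growth in the original output.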

However, if I modify it as such:

  // get the value asynchronously 200 times, log the time it took
  for (let i = 0; i < 200; i++) {
    let t1 = Date.now();
    const resp = await con.get(key)
    let t2 = Date.now();
    console.log(`${i}: time: ${t2 - t1}ms`);
  }

All the responses are <= 1ms.

Redis connections, as in the actual TCP socket connection to the Redis server, are single-threaded: commands are written in order and the replies come back in the same order. So it makes sense that the 200 read promises will not all resolve at the same time. To show that the connection blocks, we can gum up the works with blpop:

  // store a 100k value in redis
  const key = "some:test:key";
  const value = "a".repeat(100000);
  await con.set(key, value);

  // Block the connection
  const con_blocker = con.blpop('dummy_key', 0);

  // get the value asynchronously 200 times, log the time it took
  let t1 = Date.now();
  for (let i = 0; i < 200; i++) {
    con.get(key).then((value) => {
      let t2 = Date.now();
      console.log(`${i}: time: ${t2 - t1}ms`);
    });
  }

This code only prints that the connection is ready; none of the 200 gets ever resolves, because the blocking BLPOP holds the connection. Based on my understanding of the Redis protocol and JS Promises, I believe the behavior you're seeing with the code you posted is expected.
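The head-of-line blocking can also be sketched with plain promises, no Redis required. In the sketch below a never-resolving promise stands in for `blpop('dummy_key', 0)`, and each subsequent "get" is chained behind it the way commands queue on one connection:

```javascript
// Sketch: a FIFO "connection" whose first command never completes.
const blocker = new Promise(() => {}); // stands in for blpop('dummy_key', 0)
let queue = blocker;                   // the connection is now stuck
const resolved = [];
for (let i = 0; i < 200; i++) {
  // Each queued "get" can only run after everything before it finishes.
  queue = queue.then(() => resolved.push(i));
}
// No matter how long we wait, nothing behind the blocker ever resolves.
setTimeout(() => console.log(`resolved: ${resolved.length}`), 100); // resolved: 0
```

Everything queued behind the blocker stays pending forever, which is exactly why the blpop variant above only ever prints the connection-ready message.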