StackExchange / StackExchange.Redis

General purpose redis client
https://stackexchange.github.io/StackExchange.Redis/

Timeout performing SCAN (10000ms), next: SCAN #2662

Open · obrunomota opened this issue 4 months ago

obrunomota commented 4 months ago

I'm trying to remove some keys by prefix.

Here is how I'm doing it:

public override async Task RemoveByPrefixAsync(string prefix, params object[] prefixParameters)
{
    prefix = PrepareKeyPrefix(prefix, prefixParameters);

    foreach (var endPoint in _connectionWrapper.GetEndPoints())
    {
        var keys = GetKeys(endPoint, prefix);

        _db.KeyDelete(keys.ToArray());
    }

    await RemoveByPrefixInstanceDataAsync(prefix);
}

and here is the error I'm getting:

Timeout performing SCAN (10000ms), inst: 101, qu: 0, qs: 0, aw: False, rs: ReadAsync, ws: Idle, in: 0, in-pipe: 0, out-pipe: 0, serverEndpoint: xx.xx.x.xx:xxx, mc: 1/1/0, mgr: 10 of 10 available, clientName: api, IOCP: (Busy=0,Free=1000,Min=12,Max=1000), WORKER: (Busy=3,Free=32764,Min=12,Max=32767), v: 2.2.88.56325 (Please take a look at this article for some common client-side issues that can cause timeouts: https://stackexchange.github.io/StackExchange.Redis/Timeouts)

StackTrace:
   at StackExchange.Redis.CursorEnumerable`1.Enumerator.ThrowTimeout(Message message) in /_/src/StackExchange.Redis/CursorEnumerable.cs:line 244
   at StackExchange.Redis.CursorEnumerable`1.Enumerator.SlowNextSync() in /_/src/StackExchange.Redis/CursorEnumerable.cs:line 191
   at System.Collections.Generic.LargeArrayBuilder`1.AddRange(IEnumerable`1 items)
   at System.Collections.Generic.EnumerableHelpers.ToArray[T](IEnumerable`1 source)

Has anyone experienced this, and if so, how did you resolve it? Any ideas?

Thanks!

mgravell commented 4 months ago

The first thing I'd check here is SLOWLOG on the server for a long-running operation, to see whether anything server-side contributed. SCAN by itself shouldn't have a huge cost (that's the entire point of SCAN over KEYS).
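For reference, here is a rough sketch of pulling SLOWLOG from the client side; "muxer" is just a stand-in for however the app exposes its ConnectionMultiplexer, and IServer.SlowlogGet wraps the SLOWLOG GET command:

// Illustrative sketch only: "muxer" is a placeholder ConnectionMultiplexer.
foreach (var endPoint in muxer.GetEndPoints())
{
    var server = muxer.GetServer(endPoint);
    foreach (var entry in server.SlowlogGet(10)) // up to the 10 most recent slow entries
    {
        Console.WriteLine($"{entry.Time:u} {entry.Duration.TotalMilliseconds}ms: {string.Join(" ", entry.Arguments)}");
    }
}

Entries with large Duration values around the time of the timeout would point at a server-side cause.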

As a side note: rather than a blind ToArray, you might prefer the LINQ Chunk method to get reasonably sized delete batches, i.e.

foreach (var keys in GetKeys(endPoint, prefix).Chunk(batchSize))
{
    _db.KeyDelete(keys);
}

I'm also assuming that GetKeys here returns an open IEnumerable<RedisKey> rather than a closed array such as RedisKey[] (or a list, etc.), but I can't see GetKeys here.
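For illustration, a lazy GetKeys could look something like the sketch below; the actual GetKeys isn't shown in this thread, and "_muxer" is just a placeholder for whatever _connectionWrapper exposes as the underlying ConnectionMultiplexer. IServer.Keys drives SCAN and yields keys page by page:

// Illustrative sketch only; names are placeholders, not the poster's real code.
private IEnumerable<RedisKey> GetKeys(EndPoint endPoint, string prefix)
{
    var server = _muxer.GetServer(endPoint);
    // IServer.Keys issues SCAN under the covers and streams keys page by page,
    // so nothing buffers the whole key set until the caller calls ToArray/ToList.
    return server.Keys(database: _db.Database, pattern: prefix + "*", pageSize: 250);
}

Combined with Chunk, that keeps both the scan and the deletes incremental instead of buffering every matching key first.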

obrunomota commented 4 months ago

@mgravell Thank you so much for your answer.

As for the suggestions you made: after considering them, it seems the only alternative left would be to check SLOWLOG.

Besides those suggestions, do you see any other improvements I could make to try to avoid the error?

Thanks.

mgravell commented 4 months ago

A basic implementation of Chunk is shown here: https://github.com/StackExchange/StackExchange.Redis/issues/37#issuecomment-42563183
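For older target frameworks that predate LINQ's built-in Chunk (added in .NET 6), a minimal equivalent extension method might look like the sketch below; this is illustrative only, not the exact code from the linked comment:

using System.Collections.Generic;

public static class ChunkExtensions
{
    // Yields the source sequence in batches of at most "size" items, streaming
    // rather than materializing the whole sequence up front.
    public static IEnumerable<T[]> Chunk<T>(this IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch.ToArray();
                batch.Clear();
            }
        }
        if (batch.Count > 0)
        {
            yield return batch.ToArray();
        }
    }
}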