taoensso / carmine

Redis client + message queue for Clojure
https://www.taoensso.com/carmine
Eclipse Public License 1.0

wcar replies as a lazy sequence #267

Closed Coder-DG closed 2 years ago

Coder-DG commented 2 years ago

When I use the hset command in a pipeline (with wcar ... :as-pipeline) and the number of calls is very large, the client throws a Java OutOfMemoryError while creating the transient reply vector.

I have 8,831,306 strings that represent keys (each 36 chars long); the field and value are both "placeholder". Here's a sketch of the code I'm using:


(defn -hset-keys
  [conn-pool keys-kval ttl]
  (car/wcar conn-pool :as-pipeline
    (doseq [[rkey hkey hval] keys-kval]
      (car/hset rkey hkey hval))
    (when ttl
      (doseq [[rkey _ _] keys-kval]
        (car/expire rkey ttl)))))

How can I force the replies vector from wcar to be a lazy sequence? Or perhaps turn off the replies for these specific calls (i.e. never build the replies vector to begin with)?

Thank you!

ptaoussanis commented 2 years ago

@MrJazzPotato Hi there,

How can I force the replies vector from wcar to be a lazy sequence? Or perhaps turn off the replies for these specific calls (i.e. never build the replies vector to begin with)?

Neither of these is currently possible. In particular I'd note that Redis itself doesn't currently allow the suppression of replies afaik - though I guess that might be possible via a Lua script or module.

What I'd recommend in your case is simply to batch your commands to cap your maximum pipeline size. You can do this easily with partition-all. So something like this:

(doseq [batch (partition-all 1000 keys-kval)] ; Limit 1000 keys per pipeline
  (car/wcar conn-pool
    (doseq [[rkey hkey hval] batch]
      (car/hset rkey hkey hval)
      (when ttl (car/expire rkey ttl)))))
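Folding this batching into a complete function (a sketch, assuming the same conn-pool connection map and keys-kval sequence shapes from the original question; the batch size of 1000 is arbitrary and worth tuning) might look like:

```clojure
(ns example
  (:require [taoensso.carmine :as car]))

;; Hypothetical rework of -hset-keys: one pipeline per batch of <= 1000 keys.
;; Each batch's reply vector is bounded (at most 2000 replies with a ttl)
;; and becomes garbage between batches, so memory use stays flat regardless
;; of how many keys are written in total.
(defn -hset-keys-batched
  [conn-pool keys-kval ttl]
  (doseq [batch (partition-all 1000 keys-kval)]
    (car/wcar conn-pool
      (doseq [[rkey hkey hval] batch]
        (car/hset rkey hkey hval)
        (when ttl (car/expire rkey ttl))))))
```

Note that keys-kval can itself be a lazy sequence here; doseq with partition-all realises it chunk by chunk without holding the head.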

Hope that helps! Cheers :-)

ptaoussanis commented 2 years ago

You may also want to check this: https://redis.io/commands/client-reply/
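For reference, CLIENT REPLY OFF tells the server to stop sending replies entirely until CLIENT REPLY ON is issued. An untested sketch of driving it from Carmine (assuming car/redis-call, Carmine's escape hatch for commands without a dedicated wrapper, accepts the command as a vector):

```clojure
(ns example
  (:require [taoensso.carmine :as car]))

;; CAVEAT: this is a hypothetical sketch, not a verified pattern. wcar
;; normally expects one reply per command sent, so suppressing replies
;; mid-pipeline may desync Carmine's reply parsing - test carefully
;; before relying on it.
(defn hset-no-replies
  [conn-pool keys-kval]
  (car/wcar conn-pool
    (car/redis-call [:client "reply" "off"]) ; server goes silent
    (doseq [[rkey hkey hval] keys-kval]
      (car/hset rkey hkey hval))             ; no replies accumulated
    (car/redis-call [:client "reply" "on"])))  ; server replies +OK here
```

Given the parsing caveat, the batching approach above is the safer default.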

Coder-DG commented 2 years ago

Yeah, I eventually used batches so the replies wouldn't overload memory. I also stumbled upon https://redis.io/commands/client-reply/ which might be of use.

@ptaoussanis is a lazy-seq reply not possible due to Redis' protocol? Or is it more of a Clojure constraint? Or is it just not implemented, and could be a FR?

Thanks

ptaoussanis commented 2 years ago

@MrJazzPotato

Some kind of laziness may be possible, but my first inclination is that it wouldn't be a good idea for several reasons. These include worse performance, additional complexity, and, most importantly, potentially leaving open Redis connections in a hung state while waiting for the lazy sequence to be realised. And what if the sequence is never realised? I suspect this could also have implications for the Redis server.

Before going down the road of discussing if/how laziness might be possible, I'd prefer we start with a clear motivation. What problem are you trying to solve exactly?

Coder-DG commented 2 years ago

Perform a few million calls to Redis, updating HSETs with some data. I have a couple of workers that fetch said data from somewhere else lazily, so I was trying to see the best way to do this without running out of memory.

I understand the implications now and agree that what I suggested isn't the right approach. I'll just keep it batched; that works well.

I think we can close the issue.
