marcoCasamento / Hangfire.Redis.StackExchange

Hangfire Redis storage based on the original (and now unsupported) Hangfire.Redis, but using the lovely StackExchange.Redis client

Compress data on Redis #9

Closed marcoCasamento closed 8 years ago

marcoCasamento commented 9 years ago

Most string values could be efficiently compressed to save valuable memory on Redis. The problem arises especially when Hangfire uses batches, since it is then particularly exposed to processing literally millions of very small jobs. Of course, not everything is worth the compression time; let's agree on what to compress and how:

I'm not a compression expert, so I'd like to receive comments from someone who is. My opinion is that compressing within the context of a single string isn't a viable solution. Let's take an example. I decorate my job classes with an attribute that captures the current principal info (name and claims) and stores it as a JobParameter in each job, in order to restore security info during job execution. Each job hashset therefore usually contains at least the claims of the enqueuing user:

[["http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name","myUserName"],
["http://schemas.microsoft.com/ws/2008/06/identity/claims/role","oneOftheRoles"],

along with other security-relevant ones. Given that a single user could literally enqueue tens of thousands of jobs, I think that compressing the strings one by one wouldn't be as efficient as having a sort of "shared dictionary" for all jobs. However, I haven't run a single test so far, nor do I know whether this is reasonable at all.
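For what it's worth, the "shared dictionary" idea maps directly onto DEFLATE's preset-dictionary feature. A rough sketch (in Python's zlib just to illustrate the size difference; the claim URIs placed in the dictionary are the ones from the example above, and the dictionary contents are only a guess at what would be common across jobs):

```python
import zlib

# A typical per-job claims payload, as in the example above.
claims = (b'[["http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name","myUserName"],'
          b'["http://schemas.microsoft.com/ws/2008/06/identity/claims/role","oneOftheRoles"]]')

# Per-string DEFLATE: every job pays the full cost of encoding the long URIs itself.
solo = zlib.compress(claims, 9)

# Shared preset dictionary holding the substrings common to (nearly) every job.
# Matches against the dictionary are emitted as cheap back-references instead.
shared = (b'http://schemas.xmlsoap.org/ws/2005/05/identity/claims/'
          b'http://schemas.microsoft.com/ws/2008/06/identity/claims/')
comp = zlib.compressobj(level=9, zdict=shared)
with_dict = comp.compress(claims) + comp.flush()

print(len(claims), len(solo), len(with_dict))

# Decompression must use the exact same dictionary.
decomp = zlib.decompressobj(zdict=shared)
assert decomp.decompress(with_dict) == claims
```

The catch is the one the question implies: the dictionary becomes shared state that every reader and writer must agree on (and version), so changing it invalidates everything already stored.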