ZiggyCreatures / FusionCache

FusionCache is an easy-to-use, fast, and robust hybrid cache with advanced resiliency features.

[FEATURE] Add compression support for large entries in the distributed cache #240

Closed: viniciusvarzea closed this issue 4 months ago

viniciusvarzea commented 5 months ago

Problem

Redis is not designed to handle large values (100 KB can already be a large value in some environments).

Solution

If we could specify a parameter that enables compression (Gzip, Brotli, LZ4), the serializer could use it to compress the data before sending it to Redis, saving bandwidth and generally speeding Redis up.

As an example, the MessagePack-CSharp library (https://github.com/MessagePack-CSharp/MessagePack-CSharp) uses an LZ4 library for compression.
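For reference, a minimal sketch of what that looks like with MessagePack-CSharp's options API (the `Order` type here is just a hypothetical payload):

```csharp
using MessagePack;

// MessagePack-CSharp ships LZ4 compression as a serializer option:
var options = MessagePackSerializerOptions.Standard
    .WithCompression(MessagePackCompression.Lz4BlockArray);

// The payload is compressed on serialize and decompressed on
// deserialize, transparently to the caller.
byte[] payload = MessagePackSerializer.Serialize(new Order { Id = 1, Description = "example" }, options);
Order roundTripped = MessagePackSerializer.Deserialize<Order>(payload, options);

// Hypothetical payload type, just for the example.
[MessagePackObject]
public class Order
{
    [Key(0)] public int Id { get; set; }
    [Key(1)] public string? Description { get; set; }
}
```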

viniciusvarzea commented 5 months ago

The problem gets worse when you try to save large lists of entities to Redis.

jodydonetti commented 4 months ago

> Solution
>
> If we could specify a parameter that enables compression (Gzip, Brotli, LZ4), the serializer could use it to compress the data before sending it to Redis, saving bandwidth and generally speeding Redis up.

I thought about this in the past and never actually got to it, but my general thought was that it should probably be something the serializer does, not FusionCache. A serializer's job is to transform an object instance into a stream of bytes, so that is where I see it belonging.

> As an example, the MessagePack-CSharp library (https://github.com/MessagePack-CSharp/MessagePack-CSharp) uses an LZ4 library for compression.

Exactly my point: isn't it the job of the serializer?

I'm open to a discussion, of course: throw out ideas, different points of view, and rationales.
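To make the idea concrete, here's a minimal sketch of compression living entirely at the serializer level: a hypothetical decorator that wraps any existing serializer and GZips the payload. This assumes the four-method shape of IFusionCacheSerializer; the CompressedSerializer name and the GZip choice are placeholders, any algorithm would do.

```csharp
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using ZiggyCreatures.Caching.Fusion.Serialization;

// Hypothetical decorator: compresses whatever the inner serializer
// produces, so FusionCache (and the distributed cache behind it)
// only ever see opaque bytes.
public class CompressedSerializer : IFusionCacheSerializer
{
    private readonly IFusionCacheSerializer _inner;

    public CompressedSerializer(IFusionCacheSerializer inner)
    {
        _inner = inner;
    }

    public byte[] Serialize<T>(T? obj)
    {
        var raw = _inner.Serialize(obj);
        using var output = new MemoryStream();
        using (var gzip = new GZipStream(output, CompressionLevel.Fastest))
        {
            gzip.Write(raw, 0, raw.Length);
        }
        // MemoryStream.ToArray() is valid even after the stream is closed.
        return output.ToArray();
    }

    public T? Deserialize<T>(byte[] data)
    {
        using var input = new MemoryStream(data);
        using var gzip = new GZipStream(input, CompressionMode.Decompress);
        using var output = new MemoryStream();
        gzip.CopyTo(output);
        return _inner.Deserialize<T>(output.ToArray());
    }

    public ValueTask<byte[]> SerializeAsync<T>(T? obj)
        => new ValueTask<byte[]>(Serialize(obj));

    public ValueTask<T?> DeserializeAsync<T>(byte[] data)
        => new ValueTask<T?>(Deserialize<T>(data));
}
```

The trade-off is CPU time for bandwidth, which is usually a win for entries in the 100 KB+ range mentioned above.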

viniciusvarzea commented 4 months ago

@jodydonetti, since we can implement our own serializers, I think you are right.
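For example, something like this (assuming the builder exposes WithSerializer, and reusing the hypothetical CompressedSerializer sketched above around the stock System.Text.Json serializer):

```csharp
using Microsoft.Extensions.DependencyInjection;
using ZiggyCreatures.Caching.Fusion;
using ZiggyCreatures.Caching.Fusion.Serialization.SystemTextJson;

var services = new ServiceCollection();

// Wrap the stock System.Text.Json serializer with the hypothetical
// compressing decorator: FusionCache itself stays unchanged, and the
// compressed bytes are what reach the distributed cache.
services.AddFusionCache()
    .WithSerializer(new CompressedSerializer(new FusionCacheSystemTextJsonSerializer()));
```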