Closed: viniciusvarzea closed this issue 4 months ago
The problem is even bigger when you try to save large lists of entities to Redis.
Solution
If we could specify a parameter that enables compression (Gzip, Brotli, or LZ4), the serializer could use it to compress the data before sending it to Redis, saving bandwidth and speeding up Redis overall.
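As a rough illustration of the potential savings, here is a minimal sketch (plain .NET, no FusionCache involved) that serializes a large list with System.Text.Json and gzips the payload. The entity shape and numbers are made up for the demo.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Text.Json;

public static class GzipDemo
{
    public static byte[] Compress(byte[] data)
    {
        using var output = new MemoryStream();
        using (var gzip = new GZipStream(output, CompressionLevel.Fastest))
            gzip.Write(data, 0, data.Length);
        return output.ToArray();
    }

    public static byte[] Decompress(byte[] data)
    {
        using var input = new GZipStream(new MemoryStream(data), CompressionMode.Decompress);
        using var output = new MemoryStream();
        input.CopyTo(output);
        return output.ToArray();
    }

    public static void Main()
    {
        // A large, repetitive list of entities: typical of cached query results.
        var entities = Enumerable.Range(0, 10_000)
            .Select(i => new { Id = i, Name = $"customer-{i}", Status = "active" })
            .ToList();

        byte[] raw = JsonSerializer.SerializeToUtf8Bytes(entities);
        byte[] packed = Compress(raw);

        Console.WriteLine($"raw: {raw.Length} bytes, gzip: {packed.Length} bytes");

        // Round-trip check: decompressing yields the original payload.
        Console.WriteLine(Decompress(packed).SequenceEqual(raw)); // True
    }
}
```

JSON for repetitive entity lists compresses very well, which is exactly the case where Redis values grow past the 100 KB mark.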
I thought about this in the past and never actually got to it, but my general thought was that it should probably be something the serializer does, not FusionCache.
A serializer's job is to transform an object instance into a flow of bytes, therefore I see it there.
As an example, the MessagePack-CSharp library (https://github.com/MessagePack-CSharp/MessagePack-CSharp) uses an LZ4 library for compression.
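In MessagePack-CSharp (v2.x), compression is indeed exposed as an option of the serializer itself rather than of the consuming cache. A short sketch, assuming the v2 API and the MessagePack NuGet package:

```csharp
using System;
using MessagePack;

class LzDemo
{
    static void Main()
    {
        // LZ4 compression is configured on the serializer options.
        var options = MessagePackSerializerOptions.Standard
            .WithCompression(MessagePackCompression.Lz4BlockArray);

        byte[] packed = MessagePackSerializer.Serialize(new[] { 1, 2, 3 }, options);
        int[] back = MessagePackSerializer.Deserialize<int[]>(packed, options);
        Console.WriteLine(back.Length); // 3
    }
}
```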
Exactly my point: isn't it the job of the serializer?
I'm open to a discussion of course: throw out ideas, different points of view, and rationales.
@jodydonetti, since we can implement our own serializers, I think you are right.
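To make the "do it in the serializer" idea concrete, here is a minimal sketch of a compressing decorator over an existing serializer. The interface shape shown below (sync/async Serialize/Deserialize pairs) is an assumption for illustration, not copied from the FusionCache source, so check it against the real IFusionCacheSerializer before using it.

```csharp
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;

// Assumed interface shape; verify against FusionCache's actual IFusionCacheSerializer.
public interface IFusionCacheSerializer
{
    byte[] Serialize<T>(T obj);
    T Deserialize<T>(byte[] data);
    ValueTask<byte[]> SerializeAsync<T>(T obj);
    ValueTask<T> DeserializeAsync<T>(byte[] data);
}

// Decorator: compresses whatever the inner serializer produces.
public sealed class GzipSerializer : IFusionCacheSerializer
{
    private readonly IFusionCacheSerializer _inner;
    public GzipSerializer(IFusionCacheSerializer inner) => _inner = inner;

    public byte[] Serialize<T>(T obj)
    {
        byte[] raw = _inner.Serialize(obj);
        using var output = new MemoryStream();
        using (var gzip = new GZipStream(output, CompressionLevel.Fastest))
            gzip.Write(raw, 0, raw.Length);
        return output.ToArray();
    }

    public T Deserialize<T>(byte[] data)
    {
        using var input = new GZipStream(new MemoryStream(data), CompressionMode.Decompress);
        using var output = new MemoryStream();
        input.CopyTo(output);
        return _inner.Deserialize<T>(output.ToArray());
    }

    public ValueTask<byte[]> SerializeAsync<T>(T obj)
        => new ValueTask<byte[]>(Serialize(obj));

    public ValueTask<T> DeserializeAsync<T>(byte[] data)
        => new ValueTask<T>(Deserialize<T>(data));
}
```

Registered in place of the plain serializer (e.g. wrapping the JSON one), every distributed-cache payload then travels compressed; the trade-off is extra CPU per operation, which is why fast codecs like LZ4 or low Gzip/Brotli levels are the usual choice.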
Problem
Redis is not designed to handle large values (100 KB can already be a large value in some environments).