awalsh128 closed this 2 years ago
Looks like the main citation for this performance claim is from FB engineering (who designed zstd), which looks impressive: https://engineering.fb.com/2016/08/31/core-data/smaller-and-faster-data-compression-with-zstandard The tested corpus matches our use case too (binaries < 51 MB).
Just curious, have you observed this performance increase in the wild too @tamascsaba?
Yes, the decompression speed of zstd is amazing 👍
I checked, and you use actions/cache. It may not be worth compressing with zstd, gzip, or anything else yourself, because actions/cache compresses with zstd by default, so you would end up compressing twice.
Ah, good point. I think you mean to not compress?
Taking a look at the performance differences is worth it, in my opinion. For example, try installing a bigger package and caching it without compression, with only tar, with zstd, or with gzip.
Was thinking this over and may omit compression entirely, since the cache action already performs compression. This would also reduce the cycles needed to back up the files. May run a benchmark to test this hypothesis against a fairly large package.
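A benchmark along these lines could be sketched as below; the file name and payload are hypothetical, and it compares building a plain tar (letting actions/cache do the zstd compression) against a tar.gz (compressing before actions/cache compresses again).

```python
import io
import tarfile
import time

# Hypothetical payload standing in for a cached apt package's files.
payload = b"binary-ish apt package contents " * 100_000  # ~3 MB

def archive(mode: str) -> tuple[int, float]:
    """Build an in-memory tar in the given mode; return (size, seconds)."""
    buf = io.BytesIO()
    start = time.perf_counter()
    with tarfile.open(fileobj=buf, mode=mode) as tar:
        info = tarfile.TarInfo(name="usr/lib/libexample.so")  # made-up path
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
    return len(buf.getvalue()), time.perf_counter() - start

plain_size, plain_time = archive("w")     # no compression
gz_size, gz_time = archive("w:gz")        # gzip compression

print(f"plain tar: {plain_size} B in {plain_time:.4f}s")
print(f"tar.gz:    {gz_size} B in {gz_time:.4f}s")
```

On a real package set, the interesting comparison is total time including the actions/cache upload, since actions/cache will zstd-compress the plain tar anyway.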
Did some testing and concluded it would be faster to remove compression altogether, especially considering that actions/cache already does this.
Mentioned in issue #45 by @tamascsaba
Will perform some due diligence and update this when I get a chance.