viviancui59 opened this issue 6 years ago
I tried to use my own code to compress these datasets, but the compressed files all come out the same size, so I tried GZIP instead and the results are identical.
How did you use gzip to compress the files? Did you compress the generated .txt text files directly? I am wondering how much the encoding scheme will affect the final comparison with NN_compression.
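For reference, "compressing the .txt files directly" can be reproduced with Python's `gzip` module instead of the command-line tool. This is a minimal sketch; the filename is a placeholder, not the repo's actual output name:

```python
import gzip

# Hypothetical filename -- substitute the dataset produced by the generator.
filename = "markov10.txt"

# Write a small placeholder file so the sketch is self-contained.
with open(filename, "w") as f:
    f.write("0110" * 1000)

# Compress the raw text bytes directly, as `gzip file.txt` would.
with open(filename, "rb") as f:
    raw = f.read()
compressed = gzip.compress(raw, compresslevel=9)

print(f"original: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

Note that gzip sees the ASCII characters '0'/'1' (8 bits each), not packed bits, so the encoding of the text file does affect the measured ratio.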
How do you compute the compression ratio from the loss function?
Based on my experiments, gzip should compress the markovity-10 dataset much better than the markovity-50 one.
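The comparison above is easy to sanity-check. The sketch below assumes the XOR-Markov scheme described in the DeepZip paper (X[i] = X[i-k] XOR Z[i] with Z ~ Bernoulli(p)); the actual generator options in the repo may differ, and p = 0.1 is an assumed noise level:

```python
import gzip
import random

def markov_sequence(n, k, p=0.1, seed=0):
    """Binary text sequence with X[i] = X[i-k] XOR Z[i], Z ~ Bernoulli(p).

    Assumed to mirror the repo's Markov source; check the generator's
    actual options before comparing numbers.
    """
    rng = random.Random(seed)
    bits = [rng.randint(0, 1) for _ in range(k)]
    for i in range(k, n):
        z = 1 if rng.random() < p else 0
        bits.append(bits[i - k] ^ z)
    return "".join(map(str, bits)).encode()

for k in (10, 50):
    data = markov_sequence(100_000, k)
    ratio = len(gzip.compress(data, 9)) / len(data)
    print(f"markovity {k}: compressed to {ratio:.3f} of original size")
```

If the two ratios come out nearly identical, that would suggest the files were generated (or encoded) differently than expected.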
Can you post the command you used to generate the files (with the options)?
@AnsonHooL: The loss value is the compression ratio on the batch (without accounting for the 2/N factor for arithmetic coding).
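Concretely, if the reported loss is an average cross-entropy per symbol in nats (the default for most frameworks' categorical cross-entropy), it converts to bits per symbol and then to a ratio against a fixed-length code. This is a sketch under that assumption; if the loss is already in bits, skip the division by ln 2:

```python
import math

def compression_ratio(loss_nats, alphabet_size, n_symbols=None):
    """Estimate compression ratio from an average cross-entropy loss.

    loss_nats:     average loss per symbol, assumed to be in nats.
    alphabet_size: number of distinct symbols (2 for a binary source).
    n_symbols:     if given, adds the ~2-bit arithmetic-coding overhead
                   (the 2/N factor mentioned above) spread over N symbols.
    """
    bits_per_symbol = loss_nats / math.log(2)
    raw_bits = math.log2(alphabet_size)      # fixed-length baseline
    ratio = bits_per_symbol / raw_bits
    if n_symbols:
        ratio += 2 / (n_symbols * raw_bits)
    return ratio

# Example: 0.47 nats/symbol on a binary alphabet
print(round(compression_ratio(0.47, 2), 3))
```

So a loss equal to ln 2 ≈ 0.693 nats on a binary source corresponds to a ratio of 1.0, i.e. no compression over the raw bits.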
Should I try a higher-markovity dataset with gzip? The datasets were all generated by your generating code.