Intel-bigdata / SSM

Smart Storage Management for Big Data, a comprehensive hot/cold data optimized solution
Apache License 2.0

Fail to decompress some kinds of gz files #2192

lipppppp closed this issue 4 years ago

lipppppp commented 4 years ago

I found some problems when using SSM compression. I wrote a file of length 100000000 B via an SSM write action, then compressed and decompressed it with SSM actions. The decompress action failed (screenshots attached). But when I create a new file smaller than the buffer size, the compress and decompress actions both succeed. There are two problems:

1. Why is the compressed file larger than the original?
2. What causes the decompression failure?
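
For reference on question 1, here is a minimal round-trip sketch with plain `java.util.zip` rather than the SSM action path (the 1 MB size and class name are illustrative only, not from this issue). As noted later in the thread, the SSM write action fills the file with random bytes; random data is essentially incompressible, so the gzip output comes out slightly larger than the input even though a plain round trip decompresses fine:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.Random;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    public static void main(String[] args) throws Exception {
        // Random bytes are essentially incompressible, so gzip adds
        // framing overhead instead of shrinking the data.
        byte[] input = new byte[1_000_000];
        new Random(42).nextBytes(input);

        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream out = new GZIPOutputStream(compressed)) {
            out.write(input);
        }

        // A plain round trip still restores the data correctly.
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        try (GZIPInputStream in = new GZIPInputStream(
                new ByteArrayInputStream(compressed.toByteArray()))) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                restored.write(buf, 0, n);
            }
        }

        System.out.printf("input %d B -> compressed %d B, round trip ok: %b%n",
                input.length, compressed.size(),
                Arrays.equals(input, restored.toByteArray()));
    }
}
```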

PHILO-HE commented 4 years ago

@lipppppp, it seems this feature works well for files generated in other ways, just not for files created by the SSM write action.


lipppppp commented 4 years ago

Thanks for your support, @PHILO-HE. This issue seems unimportant for the main scenario. We can leave it until there is a real need.

PHILO-HE commented 4 years ago

The SSM write file action is just for test use; it writes random bytes to the file. For the compression codec, the header is checked, and this error is reported when an unexpected header is found. Yes, since compressing a common file works well, we can close this issue and reopen it if a strong requirement comes up.
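
To illustrate the header check described above (a generic sketch, not SSM's actual codec code): gzip streams begin with the magic bytes `0x1f 0x8b`, and a decompressor typically rejects input whose header does not match, which is the kind of failure reported here.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class GzipHeaderCheck {
    // Returns true if the file starts with the gzip magic bytes 0x1f 0x8b.
    public static boolean hasGzipMagic(String path) throws IOException {
        try (InputStream in = new FileInputStream(path)) {
            int b0 = in.read();
            int b1 = in.read();
            return b0 == 0x1f && b1 == 0x8b;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(hasGzipMagic(args[0])
                ? "header looks like gzip"
                : "unexpected header: a codec would refuse to decompress this");
    }
}
```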