fingltd / 4mc

4mc - splittable lz4 and zstd in hadoop/spark/flink

Commit 17c36e08653162f6956ef27f55c96c20425df4b8 breaks test units #28

Closed trixpan closed 6 years ago

trixpan commented 6 years ago
```
testZstCodec(com.hadoop.compression.fourmc.TestFourMcCodec)  Time elapsed: 0.034 sec  <<< ERROR!
java.lang.UnsatisfiedLinkError: com.hadoop.compression.fourmc.zstd.Zstd.cStreamInSize()J
        at com.hadoop.compression.fourmc.zstd.Zstd.cStreamInSize(Native Method)
        at com.hadoop.compression.fourmc.zstd.ZstdStreamCompressor.<clinit>(ZstdStreamCompressor.java:66)
        at com.hadoop.compression.fourmc.ZstCodec.<clinit>(ZstCodec.java:69)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
```
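The `UnsatisfiedLinkError` above means the JVM loaded a native library, but that library does not export the JNI symbol for `Zstd.cStreamInSize()`. A minimal sketch of the same failure mode, using a hypothetical native method that no loaded library implements:

```java
public class MissingSymbolDemo {
    // Hypothetical native method for illustration only: no loaded library
    // registers an implementation, so invoking it throws
    // UnsatisfiedLinkError -- the same failure mode as calling
    // Zstd.cStreamInSize() against a stale libhadoop-4mc.so.
    static native long cStreamInSize();

    public static void main(String[] args) {
        try {
            cStreamInSize();
        } catch (UnsatisfiedLinkError e) {
            // Message names the unresolved method, e.g.
            // "MissingSymbolDemo.cStreamInSize()J"
            System.out.println("UnsatisfiedLinkError: " + e.getMessage());
        }
    }
}
```

The error is thrown at call time, not at load time, which is why the test only fails once `ZstdStreamCompressor`'s static initializer actually invokes the native method.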
```
$ rpm -qa | grep zstd
libzstd-1.3.1-1.el7.x86_64
zstd-1.3.1-1.el7.x86_64
```
trixpan commented 6 years ago

The issue seems to be caused by the bundled native library java/hadoop-4mc/src/main/resources/com/hadoop/compression/fourmc/linux/amd64/libhadoop-4mc.so still being the old version (i.e. without native zstd support).

To work around it, one must first compile the native binaries:

```
cd native
make libhadoop-4mc
```

Then copy the output file into the Java resources path:

```
cp libhadoop-4mc.so.1.1.0 ../java/hadoop-4mc/src/main/resources/com/hadoop/compression/fourmc/linux/amd64/libhadoop-4mc.so
```

And only then run Maven.