twitter / hadoop-lzo

Refactored version of code.google.com/hadoop-gpl-compression for hadoop 0.20
GNU General Public License v3.0

fail to build hadoop-lzo on Ubuntu 14.04 with oracle-java8 #93

Closed EugenePig closed 10 years ago

EugenePig commented 10 years ago

pigpigpig@pigpigpig:~/code/hadoop-lzo$ mvn clean package -Dmaven.test.skip=true
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building hadoop-lzo 0.4.20-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-lzo ---
[INFO] Deleting /home/pigpigpig/code/hadoop-lzo/target
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (check-platform) @ hadoop-lzo ---
[INFO] Executing tasks

check-platform: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (set-props-non-win) @ hadoop-lzo --- [INFO] Executing tasks

set-props-non-win: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (set-props-win) @ hadoop-lzo --- [INFO] Executing tasks

set-props-win:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.3:resources (default-resources) @ hadoop-lzo ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/pigpigpig/code/hadoop-lzo/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-lzo ---
[INFO] Compiling 25 source files to /home/pigpigpig/code/hadoop-lzo/target/classes
[WARNING] bootstrap class path not set in conjunction with -source 1.6
[WARNING] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/DistributedLzoIndexer.java:[41,20] [deprecation] isDir() in FileStatus has been deprecated
[WARNING] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/DistributedLzoIndexer.java:[87,14] [deprecation] Job(Configuration) in Job has been deprecated
[WARNING] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndexer.java:[82,18] [deprecation] isDir() in FileStatus has been deprecated
[WARNING] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/mapreduce/LzoIndexOutputFormat.java:[31,28] [deprecation] cleanupJob(JobContext) in OutputCommitter has been deprecated
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (build-info-non-win) @ hadoop-lzo ---
[INFO] Executing tasks

build-info-non-win: [propertyfile] Creating new property file: /home/pigpigpig/code/hadoop-lzo/target/classes/build.properties [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (build-info-win) @ hadoop-lzo --- [INFO] Executing tasks

build-info-win: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (check-native-uptodate-non-win) @ hadoop-lzo --- [INFO] Executing tasks

check-native-uptodate-non-win: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (check-native-uptodate-win) @ hadoop-lzo --- [INFO] Executing tasks

check-native-uptodate-win: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (build-native-non-win) @ hadoop-lzo --- [INFO] Executing tasks

build-native-non-win: [mkdir] Created dir: /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib [mkdir] Created dir: /home/pigpigpig/code/hadoop-lzo/target/classes/native/Linux-amd64-64/lib [mkdir] Created dir: /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo [javah] [Forcefully writing file RegularFileObject[/home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoCompressor.h]] [javah] [Forcefully writing file RegularFileObject[/home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoCompressor_CompressionStrategy.h]] [javah] [Forcefully writing file RegularFileObject[/home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoDecompressor.h]] [javah] [Forcefully writing file RegularFileObject[/home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoDecompressor_CompressionStrategy.h]] [exec] checking for a BSD-compatible install... /usr/bin/install -c [exec] checking whether build environment is sane... yes [exec] checking for a thread-safe mkdir -p... /bin/mkdir -p [exec] checking for gawk... no [exec] checking for mawk... mawk [exec] checking whether make sets $(MAKE)... yes [exec] checking whether to enable maintainer-specific portions of Makefiles... no [exec] checking for style of include used by make... GNU [exec] checking for gcc... gcc [exec] checking whether the C compiler works... yes [exec] checking for C compiler default output file name... a.out [exec] checking for suffix of executables... [exec] checking whether we are cross compiling... no [exec] checking for suffix of object files... o [exec] checking whether we are using the GNU C compiler... yes [exec] checking whether gcc accepts -g... 
yes [exec] checking for gcc option to accept ISO C89... none needed [exec] checking dependency style of gcc... gcc3 [exec] checking how to run the C preprocessor... gcc -E [exec] checking for grep that handles long lines and -e... /bin/grep [exec] checking for egrep... /bin/grep -E [exec] checking for ANSI C header files... yes [exec] checking for sys/types.h... yes [exec] checking for sys/stat.h... yes [exec] checking for stdlib.h... yes [exec] checking for string.h... yes [exec] checking for memory.h... yes [exec] checking for strings.h... yes [exec] checking for inttypes.h... yes [exec] checking for stdint.h... yes [exec] checking for unistd.h... yes [exec] checking minix/config.h usability... no [exec] checking minix/config.h presence... no [exec] checking for minix/config.h... no [exec] checking whether it is safe to define EXTENSIONS... yes [exec] checking for gcc... (cached) gcc [exec] checking whether we are using the GNU C compiler... (cached) yes [exec] checking whether gcc accepts -g... (cached) yes [exec] checking for gcc option to accept ISO C89... (cached) none needed [exec] checking dependency style of gcc... (cached) gcc3 [exec] checking build system type... x86_64-unknown-linux-gnu [exec] checking host system type... x86_64-unknown-linux-gnu [exec] checking for a sed that does not truncate output... /bin/sed [exec] checking for fgrep... /bin/grep -F [exec] checking for ld used by gcc... /usr/bin/ld [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes [exec] checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B [exec] checking the name lister (/usr/bin/nm -B) interface... BSD nm [exec] checking whether ln -s works... yes [exec] checking the maximum length of command line arguments... 1572864 [exec] checking whether the shell understands some XSI constructs... yes [exec] checking whether the shell understands "+="... yes [exec] checking for /usr/bin/ld option to reload object files... -r [exec] checking for objdump... 
objdump [exec] checking how to recognize dependent libraries... pass_all [exec] checking for ar... ar [exec] checking for strip... strip [exec] checking for ranlib... ranlib [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok [exec] checking for dlfcn.h... yes [exec] checking for objdir... .libs [exec] checking if gcc supports -fno-rtti -fno-exceptions... no [exec] checking for gcc option to produce PIC... -fPIC -DPIC [exec] checking if gcc PIC flag -fPIC -DPIC works... yes [exec] checking if gcc static flag -static works... yes [exec] checking if gcc supports -c -o file.o... yes [exec] checking if gcc supports -c -o file.o... (cached) yes [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes [exec] checking whether -lc should be explicitly linked in... no [exec] checking dynamic linker characteristics... GNU/Linux ld.so [exec] checking how to hardcode library paths into programs... immediate [exec] checking whether stripping libraries is possible... yes [exec] checking if libtool supports shared libraries... yes [exec] checking whether to build shared libraries... yes [exec] checking whether to build static libraries... yes [exec] checking for dlopen in -ldl... yes [exec] checking for unistd.h... (cached) yes [exec] checking stdio.h usability... yes [exec] checking stdio.h presence... yes [exec] checking for stdio.h... yes [exec] checking stddef.h usability... yes [exec] checking stddef.h presence... yes [exec] checking for stddef.h... yes [exec] checking lzo/lzo2a.h usability... yes [exec] checking lzo/lzo2a.h presence... yes [exec] checking for lzo/lzo2a.h... yes [exec] checking Checking for the 'actual' dynamic-library for '-llzo2'... "liblzo2.so.2" [exec] checking for special C compiler options needed for large files... no [exec] checking for _FILE_OFFSET_BITS value needed for large files... no [exec] checking for stdbool.h that conforms to C99... yes [exec] checking for _Bool... 
yes [exec] checking for an ANSI C-conforming const... yes [exec] checking for off_t... yes [exec] checking for size_t... yes [exec] checking whether strerror_r is declared... yes [exec] checking for strerror_r... yes [exec] checking whether strerror_r returns char *... yes [exec] checking for mkdir... yes [exec] checking for uname... yes [exec] checking for memset... yes [exec] checking for JNI_GetCreatedJavaVMs in -ljvm... yes [exec] checking jni.h usability... yes [exec] checking jni.h presence... yes [exec] checking for jni.h... yes [exec] configure: creating ./config.status [exec] config.status: creating Makefile [exec] config.status: creating impl/config.h [exec] config.status: executing depfiles commands [exec] config.status: executing libtool commands [exec] depbase=`echo impl/lzo/LzoCompressor.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\ [exec] /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I/home/pigpigpig/code/hadoop-lzo/src/main/native -I./impl -I/usr/lib/jvm/java-8-oracle/include -I/usr/lib/jvm/java-8-oracle/include/linux -I/home/pigpigpig/code/hadoop-lzo/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoCompressor.lo -MD -MP -MF $depbase.Tpo -c -o impl/lzo/LzoCompressor.lo /home/pigpigpig/code/hadoop-lzo/src/main/native/impl/lzo/LzoCompressor.c &&\ [exec] mv -f $depbase.Tpo $depbase.Plo [exec] libtool: compile: gcc -DHAVE_CONFIG_H -I. -I/home/pigpigpig/code/hadoop-lzo/src/main/native -I./impl -I/usr/lib/jvm/java-8-oracle/include -I/usr/lib/jvm/java-8-oracle/include/linux -I/home/pigpigpig/code/hadoop-lzo/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoCompressor.lo -MD -MP -MF impl/lzo/.deps/LzoCompressor.Tpo -c /home/pigpigpig/code/hadoop-lzo/src/main/native/impl/lzo/LzoCompressor.c -fPIC -DPIC -o impl/lzo/.libs/LzoCompressor.o [exec] libtool: compile: gcc -DHAVE_CONFIG_H -I. 
-I/home/pigpigpig/code/hadoop-lzo/src/main/native -I./impl -I/usr/lib/jvm/java-8-oracle/include -I/usr/lib/jvm/java-8-oracle/include/linux -I/home/pigpigpig/code/hadoop-lzo/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoCompressor.lo -MD -MP -MF impl/lzo/.deps/LzoCompressor.Tpo -c /home/pigpigpig/code/hadoop-lzo/src/main/native/impl/lzo/LzoCompressor.c -o impl/lzo/LzoCompressor.o >/dev/null 2>&1 [exec] depbase=`echo impl/lzo/LzoDecompressor.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\ [exec] /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I/home/pigpigpig/code/hadoop-lzo/src/main/native -I./impl -I/usr/lib/jvm/java-8-oracle/include -I/usr/lib/jvm/java-8-oracle/include/linux -I/home/pigpigpig/code/hadoop-lzo/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoDecompressor.lo -MD -MP -MF $depbase.Tpo -c -o impl/lzo/LzoDecompressor.lo /home/pigpigpig/code/hadoop-lzo/src/main/native/impl/lzo/LzoDecompressor.c &&\ [exec] mv -f $depbase.Tpo $depbase.Plo [exec] libtool: compile: gcc -DHAVE_CONFIG_H -I. -I/home/pigpigpig/code/hadoop-lzo/src/main/native -I./impl -I/usr/lib/jvm/java-8-oracle/include -I/usr/lib/jvm/java-8-oracle/include/linux -I/home/pigpigpig/code/hadoop-lzo/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoDecompressor.lo -MD -MP -MF impl/lzo/.deps/LzoDecompressor.Tpo -c /home/pigpigpig/code/hadoop-lzo/src/main/native/impl/lzo/LzoDecompressor.c -fPIC -DPIC -o impl/lzo/.libs/LzoDecompressor.o [exec] libtool: compile: gcc -DHAVE_CONFIG_H -I. 
-I/home/pigpigpig/code/hadoop-lzo/src/main/native -I./impl -I/usr/lib/jvm/java-8-oracle/include -I/usr/lib/jvm/java-8-oracle/include/linux -I/home/pigpigpig/code/hadoop-lzo/src/main/native/impl -Isrc/com/hadoop/compression/lzo -g -Wall -fPIC -O2 -m64 -g -O2 -MT impl/lzo/LzoDecompressor.lo -MD -MP -MF impl/lzo/.deps/LzoDecompressor.Tpo -c /home/pigpigpig/code/hadoop-lzo/src/main/native/impl/lzo/LzoDecompressor.c -o impl/lzo/LzoDecompressor.o >/dev/null 2>&1 [exec] /bin/bash ./libtool --tag=CC --mode=link gcc -g -Wall -fPIC -O2 -m64 -g -O2 -L/usr/lib/jvm/java-8-oracle/jre/lib/amd64/server -Wl,--no-as-needed -o libgplcompression.la -rpath /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/../install/lib impl/lzo/LzoCompressor.lo impl/lzo/LzoDecompressor.lo -ljvm -ldl [exec] libtool: link: gcc -shared impl/lzo/.libs/LzoCompressor.o impl/lzo/.libs/LzoDecompressor.o -L/usr/lib/jvm/java-8-oracle/jre/lib/amd64/server -ljvm -ldl -m64 -Wl,--no-as-needed -Wl,-soname -Wl,libgplcompression.so.0 -o .libs/libgplcompression.so.0.0.0 [exec] libtool: link: (cd ".libs" && rm -f "libgplcompression.so.0" && ln -s "libgplcompression.so.0.0.0" "libgplcompression.so.0") [exec] libtool: link: (cd ".libs" && rm -f "libgplcompression.so" && ln -s "libgplcompression.so.0.0.0" "libgplcompression.so") [exec] libtool: link: ar cru .libs/libgplcompression.a impl/lzo/LzoCompressor.o impl/lzo/LzoDecompressor.o [exec] libtool: link: ranlib .libs/libgplcompression.a [exec] libtool: link: ( cd ".libs" && rm -f "libgplcompression.la" && ln -s "../libgplcompression.la" "libgplcompression.la" ) [exec] libtool: install: cp /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/.libs/libgplcompression.so.0.0.0 /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib/libgplcompression.so.0.0.0 [exec] libtool: install: warning: remember to run `libtool --finish /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/../install/lib' [exec] libtool: install: (cd 
/home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib && { ln -s -f libgplcompression.so.0.0.0 libgplcompression.so.0 || { rm -f libgplcompression.so.0 && ln -s libgplcompression.so.0.0.0 libgplcompression.so.0; }; }) [exec] libtool: install: (cd /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib && { ln -s -f libgplcompression.so.0.0.0 libgplcompression.so || { rm -f libgplcompression.so && ln -s libgplcompression.so.0.0.0 libgplcompression.so; }; }) [exec] libtool: install: cp /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/.libs/libgplcompression.lai /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib/libgplcompression.la [exec] libtool: install: cp /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/.libs/libgplcompression.a /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib/libgplcompression.a [exec] libtool: install: chmod 644 /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib/libgplcompression.a [exec] libtool: install: ranlib /home/pigpigpig/code/hadoop-lzo/target/native/Linux-amd64-64/lib/libgplcompression.a [copy] Copying 5 files to /home/pigpigpig/code/hadoop-lzo/target/classes/native/Linux-amd64-64/lib [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (build-native-win) @ hadoop-lzo --- [INFO] Executing tasks

build-native-win: [INFO] Executed tasks [INFO] [INFO] --- maven-resources-plugin:2.3:testResources (default-testResources) @ hadoop-lzo --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 12 resources [INFO] [INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-lzo --- [INFO] Not compiling test sources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (prep-test) @ hadoop-lzo --- [INFO] Executing tasks

prep-test: [mkdir] Created dir: /home/pigpigpig/code/hadoop-lzo/target/test-classes/logs [INFO] Executed tasks [INFO] [INFO] --- maven-surefire-plugin:2.14.1:test (default-test) @ hadoop-lzo --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hadoop-lzo --- [INFO] Building jar: /home/pigpigpig/code/hadoop-lzo/target/hadoop-lzo-0.4.20-SNAPSHOT.jar [INFO] [INFO] >>> maven-source-plugin:2.2.1:jar (attach-sources) @ hadoop-lzo >>> [INFO] [INFO] --- maven-antrun-plugin:1.7:run (check-platform) @ hadoop-lzo --- [INFO] Executing tasks

check-platform: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (set-props-non-win) @ hadoop-lzo --- [INFO] Executing tasks

set-props-non-win: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (set-props-win) @ hadoop-lzo --- [INFO] Executing tasks

set-props-win: [INFO] Executed tasks [INFO] [INFO] <<< maven-source-plugin:2.2.1:jar (attach-sources) @ hadoop-lzo <<< [INFO] [INFO] --- maven-source-plugin:2.2.1:jar (attach-sources) @ hadoop-lzo --- [INFO] Building jar: /home/pigpigpig/code/hadoop-lzo/target/hadoop-lzo-0.4.20-SNAPSHOT-sources.jar [INFO] [INFO] --- maven-javadoc-plugin:2.9:jar (attach-javadocs) @ hadoop-lzo --- [INFO] Loading source files for package com.hadoop.mapreduce... Loading source files for package com.hadoop.compression.lzo... Loading source files for package com.hadoop.compression.lzo.util... Loading source files for package com.hadoop.mapred... Loading source files for package com.quicklz... Loading source files for package org.apache.hadoop.io.compress... Constructing Javadoc information... Standard Doclet version 1.8.0_20 Building tree for all the packages and classes... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoIndexOutputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoIndexRecordWriter.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoLineRecordReader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoSplitInputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoSplitRecordReader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoSplitRecordReader.Counters.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/LzoTextInputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/CChecksum.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/DChecksum.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/DistributedLzoIndexer.html... 
Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/GPLNativeCodeLoader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzoCodec.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzoIndex.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzoIndexer.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzoInputFormatCommon.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzopCodec.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzopDecompressor.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzopInputStream.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/LzopOutputStream.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/util/CompatibilityUtil.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/DeprecatedLzoLineRecordReader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/DeprecatedLzoTextInputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/quicklz/QuickLZ.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/org/apache/hadoop/io/compress/LzoCodec.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/overview-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/package-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/package-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/package-tree.html... 
Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/util/package-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/util/package-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/util/package-tree.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/package-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/package-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/package-tree.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/package-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/package-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/package-tree.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/quicklz/package-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/quicklz/package-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/quicklz/package-tree.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/org/apache/hadoop/io/compress/package-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/org/apache/hadoop/io/compress/package-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/org/apache/hadoop/io/compress/package-tree.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/constant-values.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoIndexRecordWriter.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoSplitInputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoLineRecordReader.html... 
Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoSplitRecordReader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoSplitRecordReader.Counters.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoIndexOutputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/class-use/LzoTextInputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzopDecompressor.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzopInputStream.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/GPLNativeCodeLoader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzopCodec.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzopOutputStream.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzoIndex.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/DistributedLzoIndexer.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/CChecksum.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzoInputFormatCommon.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/DChecksum.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzoIndexer.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/class-use/LzoCodec.html... 
Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/util/class-use/CompatibilityUtil.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/class-use/DeprecatedLzoTextInputFormat.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/class-use/DeprecatedLzoLineRecordReader.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/quicklz/class-use/QuickLZ.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/org/apache/hadoop/io/compress/class-use/LzoCodec.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/package-use.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/compression/lzo/util/package-use.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapred/package-use.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/hadoop/mapreduce/package-use.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/com/quicklz/package-use.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/org/apache/hadoop/io/compress/package-use.html... Building index for all the packages and classes... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/overview-tree.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/index-all.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/deprecated-list.html... Building index for all classes... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/allclasses-frame.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/allclasses-noframe.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/index.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/overview-summary.html... Generating /home/pigpigpig/code/hadoop-lzo/target/apidocs/help-doc.html... 
7 errors
34 warnings
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19.251s
[INFO] Finished at: Thu Sep 18 16:26:01 CST 2014
[INFO] Final Memory: 25M/216M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.9:jar (attach-javadocs) on project hadoop-lzo: MavenReportException: Error while creating archive:
[ERROR] Exit code: 1 - /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoCodec.java:86: error: bad HTML entity
[ERROR]  * Check if native-lzo library is loaded & initialized.
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoCodec.java:89: error: bad HTML entity
[ERROR]  * @return true if native-lzo library is loaded & initialized;
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:73: warning: no @return
[ERROR] public int getNumberOfBlocks() {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:79: warning: no description for @param
[ERROR]  * @param block
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:82: error: malformed HTML
[ERROR]  * The argument block should satisfy 0 <= block < getNumberOfBlocks().
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:82: error: malformed HTML
[ERROR]  * The argument block should satisfy 0 <= block < getNumberOfBlocks().
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:166: warning: no description for @throws
[ERROR]  * @throws IOException
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:168: warning: no @return
[ERROR] public static LzoIndex readIndex(FileSystem fs, Path lzoFile) throws IOException {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndex.java:203: warning: no description for @throws
[ERROR]  * @throws IOException
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndexer.java:48: error: @param name not found
[ERROR]  * @param lzoUri The file to index.
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndexer.java:49: warning: no description for @throws
[ERROR]  * @throws IOException
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndexer.java:51: warning: no @param for lzoPath
[ERROR] public void index(Path lzoPath) throws IOException {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoIndexer.java:128: warning: no @param for args
[ERROR] public static void main(String[] args) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:42: warning: no @param for dflags
[ERROR] public void initHeaderFlags(EnumSet dflags,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:42: warning: no @param for cflags
[ERROR] public void initHeaderFlags(EnumSet dflags,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:97: warning: no @param for typ
[ERROR] public synchronized boolean verifyDChecksum(DChecksum typ, int checksum) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:97: warning: no @param for checksum
[ERROR] public synchronized boolean verifyDChecksum(DChecksum typ, int checksum) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:97: warning: no @return
[ERROR] public synchronized boolean verifyDChecksum(DChecksum typ, int checksum) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:105: warning: no @param for typ
[ERROR] public synchronized boolean verifyCChecksum(CChecksum typ, int checksum) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:105: warning: no @param for checksum
[ERROR] public synchronized boolean verifyCChecksum(CChecksum typ, int checksum) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:105: warning: no @return
[ERROR] public synchronized boolean verifyCChecksum(CChecksum typ, int checksum) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopDecompressor.java:35: warning: no @param for bufferSize
[ERROR] public LzopDecompressor(int bufferSize) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzoDecompressor.java:169: error: bad HTML entity
[ERROR]  * @return true if lzo decompressors are loaded & initialized,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopInputStream.java:113: warning: no @param for in
[ERROR] protected void readHeader(InputStream in) throws IOException {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopInputStream.java:113: warning: no @throws for java.io.IOException
[ERROR] protected void readHeader(InputStream in) throws IOException {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopOutputStream.java:41: warning: no @param for out
[ERROR] protected static void writeLzopHeader(OutputStream out,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopOutputStream.java:41: warning: no @param for strategy
[ERROR] protected static void writeLzopHeader(OutputStream out,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/LzopOutputStream.java:41: warning: no @throws for java.io.IOException
[ERROR] protected static void writeLzopHeader(OutputStream out,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:87: warning: no @return
[ERROR] public static boolean isVersion2x() {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:107: warning: no @param for conf
[ERROR] public static TaskAttemptContext newTaskAttemptContext(Configuration conf,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:107: warning: no @param for id
[ERROR] public static TaskAttemptContext newTaskAttemptContext(Configuration conf,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:107: warning: no @return
[ERROR] public static TaskAttemptContext newTaskAttemptContext(Configuration conf,
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:127: warning: no @param for context
[ERROR] public static Configuration getConfiguration(JobContext context) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:127: warning: no @return
[ERROR] public static Configuration getConfiguration(JobContext context) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:135: warning: no @param for context
[ERROR] public static Counter getCounter(TaskInputOutputContext context, Enum<?> counter) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:135: warning: no @param for counter
[ERROR] public static Counter getCounter(TaskInputOutputContext context, Enum<?> counter) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:135: warning: no @return
[ERROR] public static Counter getCounter(TaskInputOutputContext context, Enum<?> counter) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:142: warning: no @param for counter
[ERROR] public static void incrementCounter(Counter counter, long increment) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:142: warning: no @param for increment
[ERROR] public static void incrementCounter(Counter counter, long increment) {
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:147: error: bad HTML entity
[ERROR]  * Hadoop 1 & 2 compatible counter.getValue()
[ERROR] ^
[ERROR] /home/pigpigpig/code/hadoop-lzo/src/main/java/com/hadoop/compression/lzo/util/CompatibilityUtil.java:150: warning: no @param for counter
[ERROR] public static long getCounterValue(Counter counter) {
[ERROR] ^
[ERROR]
[ERROR] Command line was: /usr/lib/jvm/java-8-oracle/jre/../bin/javadoc @options @packages
[ERROR]
[ERROR] Refer to the generated Javadoc files in '/home/pigpigpig/code/hadoop-lzo/target/apidocs' dir.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

sjlee commented 10 years ago

Thanks for reporting the issue @EugenePig. Please note that Java 8 is not a supported platform for compiling the hadoop-lzo source.

Having said that, the failure happens because JDK 8's javadoc runs much stricter checks (doclint) by default, so incomplete javadoc tags that were only warnings before now fail the build (see http://stackoverflow.com/questions/15886209/maven-is-not-working-in-java-8-when-javadoc-tags-are-incomplete). I think the only real way to fix this is to fix all of these javadoc issues.
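For illustration, this is the shape of javadoc that doclint accepts: a described @param for every parameter and a described @return/@throws. The class and method names below are made up for the example, not taken from the hadoop-lzo source:

```java
/** Illustrates javadoc that passes JDK 8's doclint checks. */
public class DoclintExample {

    /**
     * Compares an expected checksum against a computed one.
     *
     * @param expected the checksum recorded in the stream header
     * @param actual   the checksum computed over the decompressed data
     * @return true if the two checksums match, false otherwise
     */
    public static boolean verifyChecksum(int expected, int actual) {
        return expected == actual;
    }

    public static void main(String[] args) {
        System.out.println(verifyChecksum(42, 42)); // prints "true"
    }
}
```

The two `bad HTML entity` errors in the log are a different doclint rule: a literal `&` inside a javadoc comment must be escaped as `&amp;`.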

If you'd like to provide a pull request for this, I'd be happy to look at it and merge it. Thanks!
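For anyone who just needs the build to pass locally in the meantime, the Stack Overflow thread linked above describes relaxing doclint rather than fixing the comments. A sketch of that workaround for a maven-javadoc-plugin 2.x configuration (this repo's pom.xml may organize the plugin section differently, and newer plugin versions use a `<doclint>` parameter instead):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <configuration>
    <!-- JDK 8 enables doclint by default and fails on incomplete javadoc;
         this switches those checks off for the javadoc goal -->
    <additionalparam>-Xdoclint:none</additionalparam>
  </configuration>
</plugin>
```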

EugenePig commented 10 years ago

I checked in some code. Is this the approach you prefer?

sjlee commented 10 years ago

Thanks. Could you kindly create a new pull request?