crs4 / pydoop

A Python MapReduce and HDFS API for Hadoop
Apache License 2.0

"Error compiling Java component" on pip installation of 2.0a3 #327

Closed abolotnov closed 5 years ago

abolotnov commented 6 years ago

Python 3.6, brew-installed Hadoop 3.1.1, and JDK 1.8.

I hope I got all the envs right:

aleksandrs-mbp:pydoop sasha$ printenv | grep -e HADOOP -e JAVA
HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1/libexec
HADOOP_COMMON_HOME=/usr/local/Cellar/hadoop/3.1.1/
HADOOP_VERSION=3.1.1
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home
HADOOP_CONF_DIR=/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop

Then, on install, the following happens (the same thing happens when building from source). I think it all goes downhill at "WARNING: could not set classpath, java code may not compile". Is this an environment variable that isn't set properly, or something else?
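For context on that warning: it suggests the build couldn't locate the Hadoop jars under HADOOP_HOME, so javac later fails on every org.apache.hadoop import. Below is a minimal, hypothetical sketch (not pydoop's actual setup code) of how such a classpath lookup might work, to illustrate what the build needs to find. With a Homebrew install the jars live under libexec/share/hadoop, which is why HADOOP_HOME should point at the libexec directory:

```python
import glob
import os


def find_hadoop_jars(hadoop_home):
    """Collect Hadoop jars under HADOOP_HOME, as a classpath builder might.

    Homebrew puts the jar tree under <prefix>/libexec/share/hadoop, so
    HADOOP_HOME must point at the libexec directory for this to match.
    """
    pattern = os.path.join(hadoop_home, "share", "hadoop", "**", "*.jar")
    return sorted(glob.glob(pattern, recursive=True))


def build_classpath(hadoop_home):
    """Join the jars into a Java classpath string; warn if none are found."""
    jars = find_hadoop_jars(hadoop_home)
    if not jars:
        # Mirrors the build's behaviour: warn rather than fail outright,
        # which is why the error only surfaces later, at javac time.
        print("WARNING: could not set classpath, java code may not compile")
    return ":".join(jars)
```

If `build_classpath(os.environ["HADOOP_HOME"])` came back empty for a layout like this one, that would be consistent with the warning seen in the log.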

aleksandrs-mbp:pydoop sasha$ pip install pydoop==2.0a3

Collecting pydoop==2.0a3
  Using cached https://files.pythonhosted.org/packages/be/c0/313f1ea502268474112d4b37df369b8d197021d25779687865fb638ace59/pydoop-2.0a3.tar.gz
Requirement already satisfied: setuptools>=3.3 in /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages (from pydoop==2.0a3) (39.2.0)
Installing collected packages: pydoop
  Running setup.py install for pydoop ... error
    Complete output from command /Library/Frameworks/Python.framework/Versions/3.6/bin/python3.6 -u -c "import setuptools, tokenize;__file__='/private/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pip-install-ymm1qorf/pydoop/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /private/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pip-record-tkpiirug/install-record.txt --single-version-externally-managed --compile:
    using setuptools version 39.2.0
    running install
    running build
    running build_py
    creating build
    creating build/lib.macosx-10.6-intel-3.6
    creating build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/test_support.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/test_utils.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/config.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/version.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/__init__.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/avrolib.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/hadoop_utils.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/hadut.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    copying pydoop/jc.py -> build/lib.macosx-10.6-intel-3.6/pydoop
    creating build/lib.macosx-10.6-intel-3.6/pydoop/app
    copying pydoop/app/submit.py -> build/lib.macosx-10.6-intel-3.6/pydoop/app
    copying pydoop/app/__init__.py -> build/lib.macosx-10.6-intel-3.6/pydoop/app
    copying pydoop/app/script_template.py -> build/lib.macosx-10.6-intel-3.6/pydoop/app
    copying pydoop/app/argparse_types.py -> build/lib.macosx-10.6-intel-3.6/pydoop/app
    copying pydoop/app/main.py -> build/lib.macosx-10.6-intel-3.6/pydoop/app
    copying pydoop/app/script.py -> build/lib.macosx-10.6-intel-3.6/pydoop/app
    creating build/lib.macosx-10.6-intel-3.6/pydoop/utils
    copying pydoop/utils/serialize.py -> build/lib.macosx-10.6-intel-3.6/pydoop/utils
    copying pydoop/utils/misc.py -> build/lib.macosx-10.6-intel-3.6/pydoop/utils
    copying pydoop/utils/py3compat.py -> build/lib.macosx-10.6-intel-3.6/pydoop/utils
    copying pydoop/utils/__init__.py -> build/lib.macosx-10.6-intel-3.6/pydoop/utils
    copying pydoop/utils/jvm.py -> build/lib.macosx-10.6-intel-3.6/pydoop/utils
    copying pydoop/utils/conversion_tables.py -> build/lib.macosx-10.6-intel-3.6/pydoop/utils
    creating build/lib.macosx-10.6-intel-3.6/pydoop/hdfs
    copying pydoop/hdfs/__init__.py -> build/lib.macosx-10.6-intel-3.6/pydoop/hdfs
    copying pydoop/hdfs/file.py -> build/lib.macosx-10.6-intel-3.6/pydoop/hdfs
    copying pydoop/hdfs/common.py -> build/lib.macosx-10.6-intel-3.6/pydoop/hdfs
    copying pydoop/hdfs/fs.py -> build/lib.macosx-10.6-intel-3.6/pydoop/hdfs
    copying pydoop/hdfs/path.py -> build/lib.macosx-10.6-intel-3.6/pydoop/hdfs
    creating build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/streams.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/binary_streams.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/jwritable_utils.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/connections.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/__init__.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/api.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/string_utils.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/pipes.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/simulator.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    copying pydoop/mapreduce/text_streams.py -> build/lib.macosx-10.6-intel-3.6/pydoop/mapreduce
    creating build/lib.macosx-10.6-intel-3.6/pydoop/hdfs/core
    copying pydoop/hdfs/core/__init__.py -> build/lib.macosx-10.6-intel-3.6/pydoop/hdfs/core
    copying pydoop/pydoop.properties -> build/lib.macosx-10.6-intel-3.6/pydoop
    running build_ext
    checking for TLS support
    creating /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var
    creating /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var/folders
    creating /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var/folders/tb
    creating /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn
    creating /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T
    creating /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4
    /usr/bin/clang -fno-strict-aliasing -Wsign-compare -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/temp.c -o /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/temp.o
    /var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pydoop_40xx15e4/temp.c:1:25: error: thread-local storage is not supported for the current target
    int main(void) { static __thread int i = 0; return i; }
                            ^
    1 error generated.
    building 'pydoop.native_core_hdfs' extension
    creating build/temp.macosx-10.6-intel-3.6
    creating build/temp.macosx-10.6-intel-3.6/src
    creating build/temp.macosx-10.6-intel-3.6/src/libhdfs
    creating build/temp.macosx-10.6-intel-3.6/src/libhdfs/common
    creating build/temp.macosx-10.6-intel-3.6/src/libhdfs/os
    creating build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix
    creating build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/jni_helper.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/jni_helper.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/exception.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/exception.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/hdfs.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/hdfs.o -Wno-write-strings
    src/libhdfs/hdfs.c:2852:49: warning: incompatible pointer types passing 'tOffset *' (aka 'long long *') to parameter of type 'jlong *' (aka 'long *') [-Wincompatible-pointer-types]
        jthr = getDefaultBlockSize(env, jFS, jPath, &blockSize);
                                                    ^~~~~~~~~~
    src/libhdfs/hdfs.c:826:61: note: passing argument to parameter 'out' here
                                          jobject jPath, jlong *out)
                                                                ^
    1 warning generated.
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/common/htable.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/common/htable.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/os/posix/thread.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix/thread.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/os/posix/mutexes.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix/mutexes.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/libhdfs/os/posix/thread_local_storage.c -o build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix/thread_local_storage.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/native_core_hdfs/hdfs_fs.cc -o build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs/hdfs_fs.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/native_core_hdfs/hdfs_module.cc -o build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs/hdfs_module.o -Wno-write-strings
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -DMACOSX=1 -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include -Inative/jni_include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/lib -I/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/include/darwin -Isrc/libhdfs -Isrc/libhdfs/include -Isrc/libhdfs/os/posix -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/native_core_hdfs/hdfs_file.cc -o build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs/hdfs_file.o -Wno-write-strings
    /usr/bin/clang++ -bundle -undefined dynamic_lookup -arch i386 -arch x86_64 -g build/temp.macosx-10.6-intel-3.6/src/libhdfs/jni_helper.o build/temp.macosx-10.6-intel-3.6/src/libhdfs/exception.o build/temp.macosx-10.6-intel-3.6/src/libhdfs/hdfs.o build/temp.macosx-10.6-intel-3.6/src/libhdfs/common/htable.o build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix/thread.o build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix/mutexes.o build/temp.macosx-10.6-intel-3.6/src/libhdfs/os/posix/thread_local_storage.o build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs/hdfs_fs.o build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs/hdfs_module.o build/temp.macosx-10.6-intel-3.6/src/native_core_hdfs/hdfs_file.o -L/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/Libraries -L/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/jre/lib/server -ldl -ljvm -o build/lib.macosx-10.6-intel-3.6/pydoop/native_core_hdfs.cpython-36m-darwin.so -Wl,-rpath,/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/jre/lib/server
    clang: warning: libstdc++ is deprecated; move to libc++ with a minimum deployment target of OS X 10.9 [-Wdeprecated]
    clang: warning: libstdc++ is deprecated; move to libc++ with a minimum deployment target of OS X 10.9 [-Wdeprecated]
    ld: warning: directory not found for option '-L/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/Libraries'
    ld: warning: ignoring file /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/jre/lib/server/libjvm.dylib, missing required architecture i386 in file /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/jre/lib/server/libjvm.dylib (1 slices)
    ld: warning: directory not found for option '-L/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/Libraries'
    building 'pydoop.sercore' extension
    creating build/temp.macosx-10.6-intel-3.6/src/serialize
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -UNDEBUG -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/serialize/sermodule.cc -o build/temp.macosx-10.6-intel-3.6/src/serialize/sermodule.o -Wno-write-strings -O3
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -UNDEBUG -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/serialize/flow.cc -o build/temp.macosx-10.6-intel-3.6/src/serialize/flow.o -Wno-write-strings -O3
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -UNDEBUG -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/serialize/command.cc -o build/temp.macosx-10.6-intel-3.6/src/serialize/command.o -Wno-write-strings -O3
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -UNDEBUG -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/serialize/serialization.cc -o build/temp.macosx-10.6-intel-3.6/src/serialize/serialization.o -Wno-write-strings -O3
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -UNDEBUG -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/serialize/SerialUtils.cc -o build/temp.macosx-10.6-intel-3.6/src/serialize/SerialUtils.o -Wno-write-strings -O3
    /usr/bin/clang -fno-strict-aliasing -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch i386 -arch x86_64 -g -UNDEBUG -I/Library/Frameworks/Python.framework/Versions/3.6/include/python3.6m -c src/serialize/StringUtils.cc -o build/temp.macosx-10.6-intel-3.6/src/serialize/StringUtils.o -Wno-write-strings -O3
    /usr/bin/clang++ -bundle -undefined dynamic_lookup -arch i386 -arch x86_64 -g build/temp.macosx-10.6-intel-3.6/src/serialize/sermodule.o build/temp.macosx-10.6-intel-3.6/src/serialize/flow.o build/temp.macosx-10.6-intel-3.6/src/serialize/command.o build/temp.macosx-10.6-intel-3.6/src/serialize/serialization.o build/temp.macosx-10.6-intel-3.6/src/serialize/SerialUtils.o build/temp.macosx-10.6-intel-3.6/src/serialize/StringUtils.o -o build/lib.macosx-10.6-intel-3.6/pydoop/sercore.cpython-36m-darwin.so
    clang: warning: libstdc++ is deprecated; move to libc++ with a minimum deployment target of OS X 10.9 [-Wdeprecated]
    clang: warning: libstdc++ is deprecated; move to libc++ with a minimum deployment target of OS X 10.9 [-Wdeprecated]
    hadoop_home: '/usr/local/Cellar/hadoop/3.1.1/libexec'
    hadoop_version: '3.1.1'
    java_home: '/Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home'
    WARNING: could not set classpath, java code may not compile
    Compiling Java classes
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:24: error: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.Configuration;
                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:25: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.Writable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:26: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.WritableComparable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:27: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.InputSplit;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:35: error: cannot find symbol
    interface DownwardProtocol<K extends Writable, V extends Writable> {
                                         ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:35: error: cannot find symbol
    interface DownwardProtocol<K extends Writable, V extends Writable> {
                                                             ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:53: error: cannot find symbol
      void setJobConf(Configuration conf) throws IOException;
                      ^
      symbol:   class Configuration
      location: interface DownwardProtocol<K,V>
      where K,V are type-variables:
        K declared in interface DownwardProtocol
        V declared in interface DownwardProtocol
    src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java:70: error: cannot find symbol
      void runMap(InputSplit split, int numReduces,
                  ^
      symbol:   class InputSplit
      location: interface DownwardProtocol<K,V>
      where K,V are type-variables:
        K declared in interface DownwardProtocol
        V declared in interface DownwardProtocol
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:24: error: package org.apache.avro does not exist
    import org.apache.avro.Schema;
                          ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:25: error: package org.apache.avro.generic does not exist
    import org.apache.avro.generic.GenericRecord;
                                  ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:26: error: package org.apache.avro.generic does not exist
    import org.apache.avro.generic.GenericData;
                                  ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:27: error: package org.apache.avro.file does not exist
    import org.apache.avro.file.CodecFactory;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:28: error: package org.apache.avro.hadoop.io does not exist
    import org.apache.avro.hadoop.io.AvroKeyValue;
                                    ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:24: error: package org.apache.avro does not exist
    import org.apache.avro.Schema;
                          ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:25: error: package org.apache.avro.generic does not exist
    import org.apache.avro.generic.GenericRecord;
                                  ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:26: error: package org.apache.avro.generic does not exist
    import org.apache.avro.generic.GenericDatumWriter;
                                  ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:27: error: package org.apache.avro.file does not exist
    import org.apache.avro.file.CodecFactory;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:28: error: package org.apache.avro.file does not exist
    import org.apache.avro.file.DataFileWriter;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:30: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.RecordWriter;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:31: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:35: error: cannot find symbol
        extends RecordWriter<K, V> {
                ^
      symbol: class RecordWriter
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:32: error: cannot find symbol
        extends PydoopAvroRecordWriterBase<GenericRecord, GenericRecord> {
                                           ^
      symbol: class GenericRecord
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:32: error: cannot find symbol
        extends PydoopAvroRecordWriterBase<GenericRecord, GenericRecord> {
                                                          ^
      symbol: class GenericRecord
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:34: error: cannot find symbol
      private Schema keyValueSchema;
              ^
      symbol:   class Schema
      location: class PydoopAvroKeyValueRecordWriter
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:36: error: cannot find symbol
      public PydoopAvroKeyValueRecordWriter(Schema writerSchema,
                                            ^
      symbol:   class Schema
      location: class PydoopAvroKeyValueRecordWriter
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:37: error: cannot find symbol
          CodecFactory compressionCodec, OutputStream outputStream)
          ^
      symbol:   class CodecFactory
      location: class PydoopAvroKeyValueRecordWriter
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:44: error: cannot find symbol
      public void write(GenericRecord key, GenericRecord value)
                        ^
      symbol:   class GenericRecord
      location: class PydoopAvroKeyValueRecordWriter
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java:44: error: cannot find symbol
      public void write(GenericRecord key, GenericRecord value)
                                           ^
      symbol:   class GenericRecord
      location: class PydoopAvroKeyValueRecordWriter
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:37: error: cannot find symbol
      protected final DataFileWriter<GenericRecord> mAvroFileWriter;
                      ^
      symbol:   class DataFileWriter
      location: class PydoopAvroRecordWriterBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroRecordWriterBase
        V extends Object declared in class PydoopAvroRecordWriterBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:37: error: cannot find symbol
      protected final DataFileWriter<GenericRecord> mAvroFileWriter;
                                     ^
      symbol:   class GenericRecord
      location: class PydoopAvroRecordWriterBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroRecordWriterBase
        V extends Object declared in class PydoopAvroRecordWriterBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:39: error: cannot find symbol
      protected PydoopAvroRecordWriterBase(Schema writerSchema,
                                           ^
      symbol:   class Schema
      location: class PydoopAvroRecordWriterBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroRecordWriterBase
        V extends Object declared in class PydoopAvroRecordWriterBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:40: error: cannot find symbol
          CodecFactory compressionCodec, OutputStream outputStream)
          ^
      symbol:   class CodecFactory
      location: class PydoopAvroRecordWriterBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroRecordWriterBase
        V extends Object declared in class PydoopAvroRecordWriterBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java:49: error: cannot find symbol
      public void close(TaskAttemptContext context) throws IOException {
                        ^
      symbol:   class TaskAttemptContext
      location: class PydoopAvroRecordWriterBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroRecordWriterBase
        V extends Object declared in class PydoopAvroRecordWriterBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:23: error: package org.apache.avro does not exist
    import org.apache.avro.Schema;
                          ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:24: error: package org.apache.avro.generic does not exist
    import org.apache.avro.generic.GenericRecord;
                                  ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:25: error: package org.apache.avro.hadoop.io does not exist
    import org.apache.avro.hadoop.io.AvroKeyValue;
                                    ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:27: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.RecordWriter;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:28: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:24: error: package org.apache.avro does not exist
    import org.apache.avro.Schema;
                          ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:25: error: package org.apache.avro.mapreduce does not exist
    import org.apache.avro.mapreduce.AvroOutputFormatBase;
                                    ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:27: error: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.Configuration;
                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:28: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:32: error: cannot find symbol
        extends AvroOutputFormatBase<K, V> {
                ^
      symbol: class AvroOutputFormatBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:32: error: cannot find symbol
        extends PydoopAvroOutputFormatBase<GenericRecord, GenericRecord> {
                                           ^
      symbol: class GenericRecord
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:32: error: cannot find symbol
        extends PydoopAvroOutputFormatBase<GenericRecord, GenericRecord> {
                                                          ^
      symbol: class GenericRecord
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:37: error: cannot find symbol
          TaskAttemptContext context) throws IOException {
          ^
      symbol:   class TaskAttemptContext
      location: class PydoopAvroKeyValueOutputFormat
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:36: error: cannot find symbol
      public RecordWriter<GenericRecord, GenericRecord> getRecordWriter(
             ^
      symbol:   class RecordWriter
      location: class PydoopAvroKeyValueOutputFormat
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:36: error: cannot find symbol
      public RecordWriter<GenericRecord, GenericRecord> getRecordWriter(
                          ^
      symbol:   class GenericRecord
      location: class PydoopAvroKeyValueOutputFormat
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java:36: error: cannot find symbol
      public RecordWriter<GenericRecord, GenericRecord> getRecordWriter(
                                         ^
      symbol:   class GenericRecord
      location: class PydoopAvroKeyValueOutputFormat
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:35: error: cannot find symbol
          TaskAttemptContext context, String propName) throws IOException {
          ^
      symbol:   class TaskAttemptContext
      location: class PydoopAvroOutputFormatBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroOutputFormatBase
        V extends Object declared in class PydoopAvroOutputFormatBase
    src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java:34: error: cannot find symbol
      protected static Schema getOutputSchema(
                       ^
      symbol:   class Schema
      location: class PydoopAvroOutputFormatBase<K,V>
      where K,V are type-variables:
        K extends Object declared in class PydoopAvroOutputFormatBase
        V extends Object declared in class PydoopAvroOutputFormatBase
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:21: error: package org.apache.commons.logging does not exist
    import org.apache.commons.logging.Log;
                                     ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:22: error: package org.apache.commons.logging does not exist
    import org.apache.commons.logging.LogFactory;
                                     ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:23: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.Writable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:24: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.WritableComparable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:25: error: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.Configuration;
                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:26: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskInputOutputContext;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:27: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.Reducer;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:28: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.ReduceContext;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:29: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.MRJobConfig;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:30: error: package org.apache.hadoop.mapred does not exist
    import org.apache.hadoop.mapred.SkipBadRecords;
                                   ^
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:40: error: cannot find symbol
        extends Reducer<K2, V2, K3, V3> {
                ^
      symbol: class Reducer
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:38: error: cannot find symbol
    class PipesReducer<K2 extends WritableComparable, V2 extends Writable,
                                  ^
      symbol: class WritableComparable
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:38: error: cannot find symbol
    class PipesReducer<K2 extends WritableComparable, V2 extends Writable,
                                                                 ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:39: error: cannot find symbol
                       K3 extends WritableComparable, V3 extends Writable>
                                  ^
      symbol: class WritableComparable
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:39: error: cannot find symbol
                       K3 extends WritableComparable, V3 extends Writable>
                                                                 ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:41: error: cannot find symbol
        private static final Log LOG = LogFactory.getLog(PipesReducer.class.getName());
                             ^
      symbol:   class Log
      location: class PipesReducer<K2,V2,K3,V3>
      where K2,V2,K3,V3 are type-variables:
        K2 declared in class PipesReducer
        V2 declared in class PipesReducer
        K3 declared in class PipesReducer
        V3 declared in class PipesReducer
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:42: error: cannot find symbol
        private Context context;
                ^
      symbol:   class Context
      location: class PipesReducer<K2,V2,K3,V3>
      where K2,V2,K3,V3 are type-variables:
        K2 declared in class PipesReducer
        V2 declared in class PipesReducer
        K3 declared in class PipesReducer
        V3 declared in class PipesReducer
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:43: error: cannot find symbol
        private Configuration configuration;
                ^
      symbol:   class Configuration
      location: class PipesReducer<K2,V2,K3,V3>
      where K2,V2,K3,V3 are type-variables:
        K2 declared in class PipesReducer
        V2 declared in class PipesReducer
        K3 declared in class PipesReducer
        V3 declared in class PipesReducer
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:33: error: package org.apache.commons.logging does not exist
    import org.apache.commons.logging.Log;
                                     ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:34: error: package org.apache.commons.logging does not exist
    import org.apache.commons.logging.LogFactory;
                                     ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:35: error: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.FSDataOutputStream;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:36: error: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.FileSystem;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:37: error: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.FileUtil;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:38: error: package org.apache.hadoop.fs does not exist
    import org.apache.hadoop.fs.Path;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:39: error: package org.apache.hadoop.fs.permission does not exist
    import org.apache.hadoop.fs.permission.FsPermission;
                                          ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:40: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.FloatWritable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:41: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.NullWritable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:42: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.Writable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:43: error: package org.apache.hadoop.io does not exist
    import org.apache.hadoop.io.WritableComparable;
                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:45: error: package org.apache.hadoop.conf does not exist
    import org.apache.hadoop.conf.Configuration;
                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:55: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskInputOutputContext;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:56: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskAttemptID;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:57: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.TaskID;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:58: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.MRJobConfig;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:59: error: package org.apache.hadoop.mapreduce does not exist
    import org.apache.hadoop.mapreduce.OutputCommitter;
                                      ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:60: error: package org.apache.hadoop.mapreduce.filecache does not exist
    import org.apache.hadoop.mapreduce.filecache.DistributedCache;
                                                ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:61: error: package org.apache.hadoop.mapreduce.security does not exist
    import org.apache.hadoop.mapreduce.security.SecureShuffleUtils;
                                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:62: error: package org.apache.hadoop.mapreduce.security does not exist
    import org.apache.hadoop.mapreduce.security.TokenCache;
                                               ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:63: error: package org.apache.hadoop.mapreduce.security.token does not exist
    import org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier;
                                                     ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:64: error: package org.apache.hadoop.mapreduce.security.token does not exist
    import org.apache.hadoop.mapreduce.security.token.JobTokenSecretManager;
                                                     ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:65: error: package org.apache.hadoop.mapreduce.lib.output does not exist
    import org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter;
                                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:66: error: package org.apache.hadoop.security.token does not exist
    import org.apache.hadoop.security.token.Token;
                                           ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:67: error: package org.apache.hadoop.util does not exist
    import org.apache.hadoop.util.ReflectionUtils;
                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:68: error: package org.apache.hadoop.util does not exist
    import org.apache.hadoop.util.StringUtils;
                                 ^
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:74: error: cannot find symbol
    class Application<K1 extends Writable, V1 extends Writable,
                                 ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:74: error: cannot find symbol
    class Application<K1 extends Writable, V1 extends Writable,
                                                      ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:75: error: cannot find symbol
                      K2 extends WritableComparable, V2 extends Writable> {
                                 ^
      symbol: class WritableComparable
    src/it/crs4/pydoop/mapreduce/pipes/Application.java:75: error: cannot find symbol
                      K2 extends WritableComparable, V2 extends Writable> {
                                                                ^
      symbol: class Writable
    src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java:49: error: package Reducer does not exist
        public void setup(Reducer.Context context) {
                                 ^
    100 errors
    error: Error compiling java component.  Command: javac -d 'build/temp.macosx-10.6-intel-3.6/pipes' src/it/crs4/pydoop/mapreduce/pipes/DownwardProtocol.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordWriter.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueOutputFormat.java src/it/crs4/pydoop/mapreduce/pipes/PipesReducer.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordWriterBase.java src/it/crs4/pydoop/mapreduce/pipes/Application.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueRecordReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroRecordReaderBase.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueInputFormat.java src/it/crs4/pydoop/mapreduce/pipes/PipesNonJavaOutputFormat.java src/it/crs4/pydoop/mapreduce/pipes/TaskLogAppender.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputFormatBase.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputKeyValueBridge.java src/it/crs4/pydoop/mapreduce/pipes/PipesMapper.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputValueBridge.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyWriter.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputBridgeBase.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyValueWriter.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeReaderBase.java src/it/crs4/pydoop/mapreduce/pipes/Submitter.java src/it/crs4/pydoop/mapreduce/pipes/BinaryProtocol.java src/it/crs4/pydoop/mapreduce/pipes/TaskLog.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyOutputFormat.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyRecordWriter.java src/it/crs4/pydoop/mapreduce/pipes/UpwardProtocol.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyRecordReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeWriterBase.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyInputFormat.java src/it/crs4/pydoop/mapreduce/pipes/PipesPartitioner.java 
src/it/crs4/pydoop/mapreduce/pipes/OpaqueSplit.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeKeyValueReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputKeyBridge.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroInputBridgeBase.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputKeyBridge.java src/it/crs4/pydoop/mapreduce/pipes/OutputHandler.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeValueReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueRecordReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputValueBridge.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueOutputFormat.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroValueRecordWriter.java src/it/crs4/pydoop/mapreduce/pipes/DummyRecordReader.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroOutputKeyValueBridge.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroBridgeValueWriter.java src/it/crs4/pydoop/mapreduce/pipes/PipesNonJavaInputFormat.java src/it/crs4/pydoop/mapreduce/pipes/PydoopAvroKeyValueInputFormat.java src/it/crs4/pydoop/NoSeparatorTextOutputFormat.java

    ----------------------------------------
Command "/Library/Frameworks/Python.framework/Versions/3.6/bin/python3.6 -u -c "import setuptools, tokenize;__file__='/private/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pip-install-ymm1qorf/pydoop/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /private/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pip-record-tkpiirug/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /private/var/folders/tb/k3y2805d0t19jy3t9hf8670r0000gn/T/pip-install-ymm1qorf/pydoop/
simleo commented 5 years ago

Hi, sorry for the late reply. Unfortunately, we don't have the resources to provide explicit support for macOS at the moment. If you (or anyone else reading this) manage to make it work, we'd be interested in any feedback. Keeping this open for now.

simleo commented 5 years ago

Hi again,

I finally got my hands on a Mac. I think the problem is the HADOOP_HOME setting, which should be /usr/local/Cellar/hadoop/3.1.1 (without the trailing libexec). In fact, I've just installed successfully without setting HADOOP_HOME at all, letting Pydoop auto-detect it. You can now install version 2.0a4 (pip install --pre pydoop), which does not even need JAVA_HOME, so a from-scratch install goes like this:

  1. Install JDK 8
  2. Install Xcode
  3. Run the following in a terminal:

    /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
    brew install hadoop
    curl -O https://bootstrap.pypa.io/get-pip.py
    python get-pip.py --user
    export PATH="/Users/${USER}/Library/Python/2.7/bin:${PATH}"
    pip install --user virtualenv
    virtualenv venv
    source venv/bin/activate
    pip install --pre pydoop
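To guard against the original problem (HADOOP_HOME pointing at the libexec subdirectory instead of the Homebrew prefix), here is a small hypothetical sanity check. The `check_hadoop_home` function is illustrative, not part of Pydoop; the commented smoke test assumes `pydoop.hadoop_home` and `pydoop.hadoop_version` helpers are available in your installed version.

```shell
# Hypothetical check: HADOOP_HOME, if you set it at all, should be the
# Homebrew install prefix, not the libexec directory underneath it.
check_hadoop_home() {
  case "${1%/}" in
    */libexec) echo "bad: points at libexec" ;;
    *)         echo "ok" ;;
  esac
}

check_hadoop_home "/usr/local/Cellar/hadoop/3.1.1"          # ok
check_hadoop_home "/usr/local/Cellar/hadoop/3.1.1/libexec"  # bad: points at libexec

# Once pydoop is installed, a quick smoke test of what it detected
# (assumed helper names; adjust to your Pydoop version):
# python -c 'import pydoop; print(pydoop.hadoop_home(), pydoop.hadoop_version())'
```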