pytorch / pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration
https://pytorch.org

Specific Aten object causes pytorch build to hang indefinitely #43054

Closed bvrockwell closed 4 years ago

bvrockwell commented 4 years ago

Hi,

I am building PyTorch 1.7 from source on a Raspberry Pi Zero (Raspbian Lite, kernel 4.19), and the build hangs indefinitely at the same spot every time. Here are my steps:

sudo apt install libopenblas-dev libblas-dev m4 cmake cython python3-dev python3-yaml python3-setuptools python3-wheel python3-pillow python3-numpy git

git clone --recursive https://github.com/pytorch/pytorch
cd pytorch

export USE_CUDA=0
export USE_CUDNN=0
export BUILD_TEST=0
export USE_MKLDNN=0
export USE_DISTRIBUTED=0
export USE_NNPACK=0
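
Not part of the original report, but relevant on this hardware: the Pi Zero has only 512 MB of RAM, and compiling the large generated ATen sources is one of the most memory-hungry steps of the build, so an apparent hang can be the compiler thrashing in swap rather than a deadlock. A hedged sketch of two precautions — PyTorch's setup.py honors a MAX_JOBS environment variable, and Raspbian manages its swap file through dphys-swapfile (the 2048 MB size below is an arbitrary example, not a tested value):

```shell
# Sketch, assuming a memory-starved board: build one translation unit at a
# time so a single gcc process gets all available RAM.
export MAX_JOBS=1

# Raspbian's swap file is controlled by /etc/dphys-swapfile (default
# CONF_SWAPSIZE is small). Enlarging it requires root; the guard below
# skips this step when the file is absent or not writable.
if [ -w /etc/dphys-swapfile ]; then
    sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=2048/' /etc/dphys-swapfile
    /etc/init.d/dphys-swapfile restart
fi
```

With MAX_JOBS=1 the build is slower overall, but each compile gets the whole of RAM plus swap to itself.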

(I have tried building without setting these as well and still get stuck at the same spot, so I don't think the flags are the cause. I have also tried running python3 setup.py clean beforehand, with no luck.)

It always hangs indefinitely (12+ hours) at the same line: Building CXX object caffe2/CMakeFiles/torch_cpu.dir/__/aten/src/ATen/Functions.cpp.o

Does anyone have an idea what could be causing this, and what I could do to resolve it?
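
One way to test whether the "hang" is actually swap thrashing (a hypothetical diagnostic, not something from this report) is to log memory headroom from a second terminal while the build runs:

```shell
# Sample MemAvailable and SwapFree from /proc/meminfo at a fixed interval
# and append the readings to build_mem.log. INTERVAL and COUNT are
# illustrative defaults; for a multi-hour build, raise COUNT accordingly.
INTERVAL=${INTERVAL:-5}
COUNT=${COUNT:-3}
i=0
while [ "$i" -lt "$COUNT" ]; do
    {
        printf '%s  ' "$(date '+%H:%M:%S')"
        awk '/^MemAvailable:|^SwapFree:/ {printf "%s %s kB  ", $1, $2} END {print ""}' /proc/meminfo
    } >> build_mem.log
    i=$((i + 1))
    sleep "$INTERVAL"
done
```

If the log shows MemAvailable collapsing toward zero and SwapFree shrinking right as Functions.cpp.o starts, the build is paging, not deadlocked; if memory stays flat for hours, something else is wrong.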

python3 setup.py install --verbose

Building wheel torch-1.7.0a0+a414bd6 -- Building version 1.7.0a0+a414bd6 cmake -DBUILD_PYTHON=True -DBUILD_TEST=False -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/home/pi/pytorch/torch -DCMAKE_PREFIX_PATH=/usr/lib/python3/dist-packages -DNUMPY_INCLUDE_DIR=/usr/lib/python3/dist-packages/numpy/core/include -DPYTHON_EXECUTABLE=/usr/bin/python3 -DPYTHON_INCLUDE_DIR=/usr/include/python3.7m -DPYTHON_LIBRARY=/usr/lib/libpython3.7m.so.1.0 -DTORCH_BUILD_VERSION=1.7.0a0+a414bd6 -DUSE_CUDA=0 -DUSE_CUDNN=0 -DUSE_DISTRIBUTED=0 -DUSE_MKLDNN=0 -DUSE_NNPACK=0 -DUSE_NUMPY=True /home/pi/pytorch -- The CXX compiler identification is GNU 8.3.0 -- The C compiler identification is GNU 8.3.0 -- Check for working CXX compiler: /usr/bin/c++ -- Check for working CXX compiler: /usr/bin/c++ -- works -- Detecting CXX compiler ABI info -- Detecting CXX compiler ABI info - done -- Detecting CXX compile features -- Detecting CXX compile features - done -- Check for working C compiler: /usr/bin/cc -- Check for working C compiler: /usr/bin/cc -- works -- Detecting C compiler ABI info -- Detecting C compiler ABI info - done -- Detecting C compile features -- Detecting C compile features - done -- Not forcing any particular BLAS to be found -- Performing Test COMPILER_WORKS -- Performing Test COMPILER_WORKS - Success -- Performing Test SUPPORT_GLIBCXX_USE_C99 -- Performing Test SUPPORT_GLIBCXX_USE_C99 - Success -- Performing Test CAFFE2_EXCEPTION_PTR_SUPPORTED -- Performing Test CAFFE2_EXCEPTION_PTR_SUPPORTED - Success -- std::exception_ptr is supported. -- Performing Test CAFFE2_NEED_TO_TURN_OFF_DEPRECATION_WARNING -- Performing Test CAFFE2_NEED_TO_TURN_OFF_DEPRECATION_WARNING - Failed -- Turning off deprecation warning due to glog. 
-- Performing Test CAFFE2_COMPILER_SUPPORTS_AVX2_EXTENSIONS -- Performing Test CAFFE2_COMPILER_SUPPORTS_AVX2_EXTENSIONS - Failed -- Performing Test CAFFE2_COMPILER_SUPPORTS_AVX512_EXTENSIONS -- Performing Test CAFFE2_COMPILER_SUPPORTS_AVX512_EXTENSIONS - Failed -- Performing Test COMPILER_SUPPORTS_HIDDEN_VISIBILITY -- Performing Test COMPILER_SUPPORTS_HIDDEN_VISIBILITY - Success -- Performing Test COMPILER_SUPPORTS_HIDDEN_INLINE_VISIBILITY -- Performing Test COMPILER_SUPPORTS_HIDDEN_INLINE_VISIBILITY - Success -- Performing Test COMPILER_SUPPORTS_RDYNAMIC -- Performing Test COMPILER_SUPPORTS_RDYNAMIC - Success -- Building using own protobuf under third_party per request. -- Use custom protobuf build.

-- 3.11.4.0 -- Looking for pthread.h -- Looking for pthread.h - found -- Looking for pthread_create -- Looking for pthread_create - not found -- Check if compiler accepts -pthread -- Check if compiler accepts -pthread - yes -- Found Threads: TRUE -- Performing Test protobuf_HAVE_BUILTIN_ATOMICS -- Performing Test protobuf_HAVE_BUILTIN_ATOMICS - Failed -- Caffe2 protobuf include directory: $<BUILD_INTERFACE:/home/pi/pytorch/third_party/protobuf/src> -- Trying to find preferred BLAS backend of choice: MKL -- MKL_THREADING = OMP -- Looking for sys/types.h -- Looking for sys/types.h - found -- Looking for stdint.h -- Looking for stdint.h - found -- Looking for stddef.h -- Looking for stddef.h - found -- Check size of void* -- Check size of void* - done -- MKL_THREADING = OMP CMake Warning at cmake/Dependencies.cmake:148 (message): MKL could not be found. Defaulting to Eigen Call Stack (most recent call first): CMakeLists.txt:472 (include)

CMake Warning at cmake/Dependencies.cmake:172 (message): Preferred BLAS (MKL) cannot be found, now searching for a general BLAS library Call Stack (most recent call first): CMakeLists.txt:472 (include)

-- MKL_THREADING = OMP -- Checking for [mkl_intel - mkl_gnu_thread - mkl_core - gomp - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_intel - mkl_intel_thread - mkl_core - gomp - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_gf - mkl_gnu_thread - mkl_core - gomp - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_gf - mkl_intel_thread - mkl_core - gomp - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_intel - mkl_gnu_thread - mkl_core - iomp5 - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_intel - mkl_intel_thread - mkl_core - iomp5 - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_gf - mkl_gnu_thread - mkl_core - iomp5 - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_gf - mkl_intel_thread - mkl_core - iomp5 - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_intel - mkl_gnu_thread - mkl_core - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_intel - mkl_intel_thread - mkl_core - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_gf - mkl_gnu_thread - mkl_core - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_gf - mkl_intel_thread - mkl_core - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_intel - mkl_sequential - mkl_core - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_gf - mkl_sequential - mkl_core - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_intel - mkl_core - gomp - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_gf - mkl_core - gomp - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_intel - mkl_core - iomp5 - pthread - m - dl] -- Library mkl_intel: not found -- Checking for [mkl_gf - mkl_core - iomp5 - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl_intel - mkl_core - pthread - m - dl] -- Library mkl_intel: not found 
-- Checking for [mkl_gf - mkl_core - pthread - m - dl] -- Library mkl_gf: not found -- Checking for [mkl - guide - pthread - m] -- Library mkl: not found -- MKL library not found -- Checking for [Accelerate] -- Library Accelerate: BLAS_Accelerate_LIBRARY-NOTFOUND -- Checking for [vecLib] -- Library vecLib: BLAS_vecLib_LIBRARY-NOTFOUND -- Checking for [openblas] -- Library openblas: /usr/lib/arm-linux-gnueabihf/libopenblas.so -- Looking for sgemm_ -- Looking for sgemm_ - found -- Performing Test BLAS_F2C_DOUBLE_WORKS -- Performing Test BLAS_F2C_DOUBLE_WORKS - Failed -- Performing Test BLAS_F2C_FLOAT_WORKS -- Performing Test BLAS_F2C_FLOAT_WORKS - Success -- Performing Test BLAS_USE_CBLAS_DOT -- Performing Test BLAS_USE_CBLAS_DOT - Success -- Found a library with BLAS API (open). -- The ASM compiler identification is GNU -- Found assembler: /usr/bin/cc CMake Warning at cmake/Dependencies.cmake:659 (message): A compiler with AVX512 support is required for FBGEMM. Not compiling with FBGEMM. Turn this warning off by USE_FBGEMM=OFF. Call Stack (most recent call first): CMakeLists.txt:472 (include)

CMake Warning at cmake/Dependencies.cmake:666 (message): x64 operating system is required for FBGEMM. Not compiling with FBGEMM. Turn this warning off by USE_FBGEMM=OFF. Call Stack (most recent call first): CMakeLists.txt:472 (include)

CMake Warning at cmake/Dependencies.cmake:698 (message): Turning USE_FAKELOWP off as it depends on USE_FBGEMM. Call Stack (most recent call first): CMakeLists.txt:472 (include)

-- Could NOT find Numa (missing: Numa_INCLUDE_DIR Numa_LIBRARIES) CMake Warning at cmake/Dependencies.cmake:748 (message): Not compiling with NUMA. Suppress this warning with -DUSE_NUMA=OFF Call Stack (most recent call first): CMakeLists.txt:472 (include)

-- Using third party subdirectory Eigen. -- Found PythonInterp: /usr/bin/python3 (found suitable version "3.7.3", minimum required is "3.0") -- Found PythonLibs: /usr/lib/libpython3.7m.so.1.0 (found suitable version "3.7.3", minimum required is "3.0") -- Could NOT find pybind11 (missing: pybind11_DIR) -- Could NOT find pybind11 (missing: pybind11_INCLUDE_DIR) -- Using third_party/pybind11. -- pybind11 include dirs: /home/pi/pytorch/cmake/../third_party/pybind11/include -- Adding OpenMP CXX_FLAGS: -fopenmp -- Will link against OpenMP libraries: /usr/lib/gcc/arm-linux-gnueabihf/8/libgomp.so;/usr/lib/arm-linux-gnueabihf/libpthread.so CMake Warning at cmake/Dependencies.cmake:1225 (message): Not using CUDA/ROCM, so disabling USE_NCCL. Suppress this warning with -DUSE_NCCL=OFF. Call Stack (most recent call first): CMakeLists.txt:472 (include)

CMake Warning at cmake/Dependencies.cmake:1346 (message): Metal is only used in ios builds. Call Stack (most recent call first): CMakeLists.txt:472 (include)

Generated: /home/pi/pytorch/build/third_party/onnx/onnx/onnx_onnx_torch-ml.proto Generated: /home/pi/pytorch/build/third_party/onnx/onnx/onnx-operators_onnx_torch-ml.proto

-- **** Summary **** -- CMake version : 3.13.4 -- CMake command : /usr/bin/cmake -- System : Linux -- C++ compiler : /usr/bin/c++ -- C++ compiler version : 8.3.0 -- CXX flags : -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -Wnon-virtual-dtor -- Build type : Release -- Compile definitions : ONNX_ML=1;ONNXIFI_ENABLE_EXT=1 -- CMAKE_PREFIX_PATH : /usr/lib/python3/dist-packages -- CMAKE_INSTALL_PREFIX : /home/pi/pytorch/torch -- CMAKE_MODULE_PATH : /home/pi/pytorch/cmake/Modules

-- ONNX version : 1.7.0 -- ONNX NAMESPACE : onnx_torch -- ONNX_BUILD_TESTS : OFF -- ONNX_BUILD_BENCHMARKS : OFF -- ONNX_USE_LITE_PROTO : OFF -- ONNXIFI_DUMMY_BACKEND : OFF -- ONNXIFI_ENABLE_EXT : OFF

-- Protobuf compiler : -- Protobuf includes : -- Protobuf libraries : -- BUILD_ONNX_PYTHON : OFF

-- **** Summary **** -- CMake version : 3.13.4 -- CMake command : /usr/bin/cmake -- System : Linux -- C++ compiler : /usr/bin/c++ -- C++ compiler version : 8.3.0 -- CXX flags : -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -Wnon-virtual-dtor -- Build type : Release -- Compile definitions : ONNX_ML=1;ONNXIFI_ENABLE_EXT=1 -- CMAKE_PREFIX_PATH : /usr/lib/python3/dist-packages -- CMAKE_INSTALL_PREFIX : /home/pi/pytorch/torch -- CMAKE_MODULE_PATH : /home/pi/pytorch/cmake/Modules

-- ONNX version : 1.4.1 -- ONNX NAMESPACE : onnx_torch -- ONNX_BUILD_TESTS : OFF -- ONNX_BUILD_BENCHMARKS : OFF -- ONNX_USE_LITE_PROTO : OFF -- ONNXIFI_DUMMY_BACKEND : OFF

-- Protobuf compiler : -- Protobuf includes : -- Protobuf libraries : -- BUILD_ONNX_PYTHON : OFF -- Could not find CUDA with FP16 support, compiling without torch.CudaHalfTensor -- Adding -DNDEBUG to compile flags -- MAGMA not found. Compiling without MAGMA support -- Could not find hardware support for NEON on this machine. -- No OMAP3 processor on this machine. -- No OMAP4 processor on this machine. -- Looking for cpuid.h -- Looking for cpuid.h - not found -- Performing Test NO_GCC_EBX_FPIC_BUG -- Performing Test NO_GCC_EBX_FPIC_BUG - Failed -- Performing Test C_HAS_AVX_1 -- Performing Test C_HAS_AVX_1 - Failed -- Performing Test C_HAS_AVX_2 -- Performing Test C_HAS_AVX_2 - Failed -- Performing Test C_HAS_AVX_3 -- Performing Test C_HAS_AVX_3 - Failed -- Performing Test C_HAS_AVX2_1 -- Performing Test C_HAS_AVX2_1 - Failed -- Performing Test C_HAS_AVX2_2 -- Performing Test C_HAS_AVX2_2 - Failed -- Performing Test C_HAS_AVX2_3 -- Performing Test C_HAS_AVX2_3 - Failed -- Performing Test CXX_HAS_AVX_1 -- Performing Test CXX_HAS_AVX_1 - Failed -- Performing Test CXX_HAS_AVX_2 -- Performing Test CXX_HAS_AVX_2 - Failed -- Performing Test CXX_HAS_AVX_3 -- Performing Test CXX_HAS_AVX_3 - Failed -- Performing Test CXX_HAS_AVX2_1 -- Performing Test CXX_HAS_AVX2_1 - Failed -- Performing Test CXX_HAS_AVX2_2 -- Performing Test CXX_HAS_AVX2_2 - Failed -- Performing Test CXX_HAS_AVX2_3 -- Performing Test CXX_HAS_AVX2_3 - Failed -- Looking for cheev_ -- Looking for cheev_ - found -- Found a library with LAPACK API (open). disabling CUDA because NOT USE_CUDA is set -- USE_CUDNN is set to 0. Compiling without cuDNN support disabling ROCM because NOT USE_ROCM is set -- MIOpen not found. 
Compiling without MIOpen support disabling MKLDNN because USE_MKLDNN is not set -- Looking for clock_gettime in rt -- Looking for clock_gettime in rt - found -- Looking for mmap -- Looking for mmap - found -- Looking for shm_open -- Looking for shm_open - found -- Looking for shm_unlink -- Looking for shm_unlink - found -- Looking for malloc_usable_size -- Looking for malloc_usable_size - found -- Performing Test C_HAS_THREAD -- Performing Test C_HAS_THREAD - Success -- Version: 6.2.0 -- Build type: Release -- CXX_STANDARD: 14 -- Performing Test has_std_14_flag -- Performing Test has_std_14_flag - Success -- Performing Test has_std_1y_flag -- Performing Test has_std_1y_flag - Success -- Performing Test SUPPORTS_USER_DEFINED_LITERALS -- Performing Test SUPPORTS_USER_DEFINED_LITERALS - Success -- Performing Test FMT_HAS_VARIANT -- Performing Test FMT_HAS_VARIANT - Success -- Required features: cxx_variadic_templates -- Looking for strtod_l -- Looking for strtod_l - not found -- GCC 8.3.0: Adding gcc and gcc_s libs to link line -- Performing Test HAS_WERROR_FORMAT -- Performing Test HAS_WERROR_FORMAT - Success -- Looking for backtrace -- Looking for backtrace - found -- backtrace facility detected in default set of libraries -- Found Backtrace: /usr/include -- don't use NUMA -- Using ATen parallel backend: OMP disabling CUDA because USE_CUDA is set false -- Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the system variable OPENSSL_ROOT_DIR (missing: OPENSSL_CRYPTO_LIBRARY OPENSSL_INCLUDE_DIR) -- Check size of long double -- Check size of long double - done -- Performing Test COMPILER_SUPPORTS_FLOAT128 -- Performing Test COMPILER_SUPPORTS_FLOAT128 - Failed -- Found OpenMP_C: -fopenmp (found version "4.5") -- Found OpenMP_CXX: -fopenmp (found version "4.5") -- Found OpenMP: TRUE (found version "4.5") -- Performing Test COMPILER_SUPPORTS_OPENMP -- Performing Test COMPILER_SUPPORTS_OPENMP - Success -- Performing Test COMPILER_SUPPORTS_WEAK_ALIASES 
-- Performing Test COMPILER_SUPPORTS_WEAK_ALIASES - Success -- Performing Test COMPILER_SUPPORTS_BUILTIN_MATH -- Performing Test COMPILER_SUPPORTS_BUILTIN_MATH - Success -- Performing Test COMPILER_SUPPORTS_SYS_GETRANDOM -- Performing Test COMPILER_SUPPORTS_SYS_GETRANDOM - Success -- Configuring build for SLEEF-v3.4.0 Target system: Linux-4.19.118+ Target processor: armv6l Host system: Linux-4.19.118+ Host processor: armv6l Detected C compiler: GNU @ /usr/bin/cc -- Using option -Wall -Wno-unused -Wno-attributes -Wno-unused-result -Wno-psabi -ffp-contract=off -fno-math-errno -fno-trapping-math to compile libsleef -- Building shared libs : OFF -- MPFR : LIB_MPFR-NOTFOUND -- GMP : LIBGMP-NOTFOUND -- RT : /usr/lib/arm-linux-gnueabihf/librt.so -- FFTW3 : LIBFFTW3-NOTFOUND -- OPENSSL : -- SDE : SDE_COMMAND-NOTFOUND -- RUNNING_ON_TRAVIS : 0 -- COMPILER_SUPPORTS_OPENMP : 1 AT_INSTALL_INCLUDE_DIR include/ATen/core core header install: /home/pi/pytorch/build/aten/src/ATen/core/TensorBody.h -- NCCL operators skipped due to no CUDA support -- Excluding FakeLowP operators -- Excluding ideep operators as we are not using ideep -- Excluding image processing operators due to no opencv -- Excluding video processing operators due to no opencv -- MPI operators skipped due to no MPI support -- Include Observer library -- /usr/bin/c++ /home/pi/pytorch/torch/abi-check.cpp -o /home/pi/pytorch/build/abi-check -- Determined _GLIBCXX_USE_CXX11_ABI=1 -- pytorch is compiling with OpenMP. OpenMP CXX_FLAGS: -fopenmp. OpenMP libraries: /usr/lib/gcc/arm-linux-gnueabihf/8/libgomp.so;/usr/lib/arm-linux-gnueabihf/libpthread.so. -- Caffe2 is compiling with OpenMP. OpenMP CXX_FLAGS: -fopenmp. OpenMP libraries: /usr/lib/gcc/arm-linux-gnueabihf/8/libgomp.so;/usr/lib/arm-linux-gnueabihf/libpthread.so. 
-- Using lib/python3/dist-packages as python relative installation path CMake Warning at CMakeLists.txt:703 (message): Generated cmake files are only fully tested if one builds with system glog, gflags, and protobuf. Other settings may generate files that are not well tested.

-- -- **** Summary **** -- General: -- CMake version : 3.13.4 -- CMake command : /usr/bin/cmake -- System : Linux -- C++ compiler : /usr/bin/c++ -- C++ compiler id : GNU -- C++ compiler version : 8.3.0 -- BLAS : MKL -- CXX flags : -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DUSE_VULKAN_WRAPPER -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow -- Build type : Release -- Compile definitions : ONNX_ML=1;ONNXIFI_ENABLE_EXT=1;ONNX_NAMESPACE=onnx_torch;HAVE_MMAP=1;_FILE_OFFSET_BITS=64;HAVE_SHM_OPEN=1;HAVE_SHM_UNLINK=1;HAVE_MALLOC_USABLE_SIZE=1;USE_EXTERNAL_MZCRC;MINIZ_DISABLE_ZIP_READER_CRC32_CHECKS -- CMAKE_PREFIX_PATH : /usr/lib/python3/dist-packages -- CMAKE_INSTALL_PREFIX : /home/pi/pytorch/torch

-- TORCH_VERSION : 1.7.0 -- CAFFE2_VERSION : 1.7.0 -- BUILD_CAFFE2_MOBILE : OFF -- USE_STATIC_DISPATCH : OFF -- BUILD_BINARY : OFF -- BUILD_CUSTOM_PROTOBUF : ON -- Link local protobuf : ON -- BUILD_DOCS : OFF -- BUILD_PYTHON : True -- Python version : 3.7.3 -- Python executable : /usr/bin/python3 -- Pythonlibs version : 3.7.3 -- Python library : /usr/lib/libpython3.7m.so.1.0 -- Python includes : /usr/include/python3.7m -- Python site-packages: lib/python3/dist-packages -- BUILD_CAFFE2_OPS : ON -- BUILD_SHARED_LIBS : ON -- BUILD_TEST : False -- BUILD_JNI : OFF -- INTERN_BUILD_MOBILE : -- CLANG_CODE_COVERAGE : OFF -- USE_ASAN : OFF -- USE_CUDA : 0 -- USE_ROCM : OFF -- USE_EIGEN_FOR_BLAS : ON -- USE_FBGEMM : OFF -- USE_FAKELOWP : OFF -- USE_FFMPEG : OFF -- USE_GFLAGS : OFF -- USE_GLOG : OFF -- USE_LEVELDB : OFF -- USE_LITE_PROTO : OFF -- USE_LMDB : OFF -- USE_METAL : OFF -- USE_MKL : OFF -- USE_MKLDNN : OFF -- USE_NCCL : OFF -- USE_NNPACK : 0 -- USE_NUMPY : ON -- USE_OBSERVERS : ON -- USE_OPENCL : OFF -- USE_OPENCV : OFF -- USE_OPENMP : ON -- USE_TBB : OFF -- USE_VULKAN : OFF -- USE_PROF : OFF -- USE_QNNPACK : ON -- USE_PYTORCH_QNNPACK : ON -- USE_REDIS : OFF -- USE_ROCKSDB : OFF -- USE_ZMQ : OFF -- USE_DISTRIBUTED : 0 -- Public Dependencies : Threads::Threads -- Private Dependencies : pthreadpool;cpuinfo;qnnpack;pytorch_qnnpack;XNNPACK;fp16;aten_op_header_gen;foxi_loader;rt;fmt::fmt-header-only;gcc_s;gcc;dl -- Configuring done -- Generating done -- Build files have been written to: /home/pi/pytorch/build cmake --build . 
--target install --config Release -- -j 1 Scanning dependencies of target libprotobuf [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/any_lite.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/arena.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/extension_set.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/generated_enum_util.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/generated_message_table_driven_lite.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/generated_message_util.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/implicit_weak_message.cc.o [ 0%] Building CXX object third_party/protobuf/cmake/CMakeFiles/libprotobuf.dir/__/src/google/protobuf/io/coded_stream.cc.o .... 
[ 39%] Built target sleef [ 39%] Generating ../../torch/csrc/autograd/generated/Functions.cpp, ../../torch/csrc/jit/generated/generated_unboxing_wrappers_0.cpp, ../../torch/csrc/jit/generated/generated_unboxing_wrappers_1.cpp, ../../torch/csrc/jit/generated/generated_unboxing_wrappers_2.cpp, ../../torch/csrc/autograd/generated/VariableType_0.cpp, ../../torch/csrc/autograd/generated/VariableType_1.cpp, ../../torch/csrc/autograd/generated/VariableType_2.cpp, ../../torch/csrc/autograd/generated/VariableType_3.cpp, ../../torch/csrc/autograd/generated/VariableType_4.cpp, ../../torch/csrc/autograd/generated/TraceType_0.cpp, ../../torch/csrc/autograd/generated/TraceType_1.cpp, ../../torch/csrc/autograd/generated/TraceType_2.cpp, ../../torch/csrc/autograd/generated/TraceType_3.cpp, ../../torch/csrc/autograd/generated/TraceType_4.cpp, ../../torch/csrc/autograd/generated/Functions.h, ../../torch/csrc/autograd/generated/variable_factories.h, ../../torch/csrc/autograd/generated/VariableType.h, ../../torch/csrc/autograd/generated/python_functions.cpp, ../../torch/csrc/autograd/generated/python_variable_methods.cpp, ../../torch/csrc/autograd/generated/python_torch_functions.cpp, ../../torch/csrc/autograd/generated/python_nn_functions.cpp, ../../torch/csrc/autograd/generated/python_fft_functions.cpp, ../../torch/csrc/autograd/generated/python_linalg_functions.cpp, ../../torch/csrc/autograd/generated/python_functions.h, ../../torch/testing/_internal/generated/annotated_fn_args.py Writing torch/csrc/autograd/generated/python_functions.h Writing torch/csrc/autograd/generated/python_functions.cpp Writing torch/csrc/autograd/generated/python_variable_methods.cpp Writing torch/csrc/autograd/generated/python_torch_functions.cpp Writing torch/csrc/autograd/generated/python_nn_functions.cpp Writing torch/csrc/autograd/generated/python_fft_functions.cpp Writing torch/csrc/autograd/generated/python_linalg_functions.cpp Writing torch/csrc/autograd/generated/VariableType.h Writing 
torch/csrc/autograd/generated/VariableType_0.cpp Writing torch/csrc/autograd/generated/TraceType_0.cpp Writing torch/csrc/autograd/generated/VariableType_1.cpp Writing torch/csrc/autograd/generated/TraceType_1.cpp Writing torch/csrc/autograd/generated/VariableType_2.cpp Writing torch/csrc/autograd/generated/TraceType_2.cpp Writing torch/csrc/autograd/generated/VariableType_3.cpp Writing torch/csrc/autograd/generated/TraceType_3.cpp Writing torch/csrc/autograd/generated/VariableType_4.cpp Writing torch/csrc/autograd/generated/TraceType_4.cpp Writing torch/csrc/autograd/generated/VariableTypeEverything.cpp Writing torch/csrc/autograd/generated/TraceTypeEverything.cpp Writing torch/csrc/autograd/generated/RegistrationDeclarations.h Writing torch/csrc/autograd/generated/Functions.h Writing torch/csrc/autograd/generated/Functions.cpp Writing torch/csrc/autograd/generated/variable_factories.h Writing torch/csrc/jit/generated/generated_unboxing_wrappers_0.cpp Writing torch/csrc/jit/generated/generated_unboxing_wrappers_1.cpp Writing torch/csrc/jit/generated/generated_unboxing_wrappers_2.cpp Writing torch/testing/_internal/generated/annotated_fn_args.py Scanning dependencies of target torch_cpu ..... [ 48%] Building CXX object caffe2/CMakeFiles/torch_cpu.dir/__/aten/src/ATen/CPUType.cpp.o [ 48%] Building CXX object caffe2/CMakeFiles/torch_cpu.dir/__/aten/src/ATen/Functions.cpp.o

Any help would be greatly appreciated!

izdeby commented 4 years ago

Hi @bvrockwell, please use our forums for questions.

JoabeSilva commented 3 years ago

Hi,

I have the same problem. @izdeby, is there any solution on the forums? I couldn't find it there either.