Closed saghul closed 11 months ago
Doh, `ninja` segfaults when building llama.cpp:
```
#16 175.2 Running command Building wheel for llama-cpp-python (pyproject.toml)
#16 175.9 *** scikit-build-core 0.6.1 using CMake 3.27.7 (wheel)
#16 176.0 *** Configuring CMake...
#16 176.1 2023-11-13 12:52:32,691 - scikit_build_core - WARNING - libdir/ldlibrary: /usr/lib/x86_64-linux-gnu/libpython3.11.so is not a real file!
#16 176.1 2023-11-13 12:52:32,692 - scikit_build_core - WARNING - Can't find a Python library, got libdir=/usr/lib/x86_64-linux-gnu, ldlibrary=libpython3.11.so, multiarch=x86_64-linux-gnu, masd=x86_64-linux-gnu
#16 176.1 loading initial cache file /tmp/tmp4f8ih2pw/build/CMakeInit.txt
#16 176.9 -- The C compiler identification is GNU 11.4.0
#16 177.6 -- The CXX compiler identification is GNU 11.4.0
#16 177.8 -- Detecting C compiler ABI info
#16 178.6 -- Detecting C compiler ABI info - done
#16 178.7 -- Check for working C compiler: /usr/bin/cc - skipped
#16 178.7 -- Detecting C compile features
#16 178.7 -- Detecting C compile features - done
#16 178.9 -- Detecting CXX compiler ABI info
#16 179.1 CMake Error:
#16 179.1 Running
#16 179.1
#16 179.1  '/tmp/pip-build-env-8kk1nsdl/normal/lib/python3.11/site-packages/ninja/data/bin/ninja' '-C' '/tmp/tmp4f8ih2pw/build/CMakeFiles/CMakeScratch/TryCompile-XE9GkA' '-t' 'restat' 'build.ninja'
#16 179.1
#16 179.1 failed with:
#16 179.1
#16 179.1  Segmentation fault
#16 179.1
#16 179.1
#16 179.1 CMake Error at /tmp/pip-build-env-8kk1nsdl/normal/lib/python3.11/site-packages/cmake/data/share/cmake-3.27/Modules/CMakeDetermineCompilerABI.cmake:57 (try_compile):
#16 179.1   Failed to generate test project build system.
#16 179.1 Call Stack (most recent call first):
#16 179.1   /tmp/pip-build-env-8kk1nsdl/normal/lib/python3.11/site-packages/cmake/data/share/cmake-3.27/Modules/CMakeTestCXXCompiler.cmake:26 (CMAKE_DETERMINE_COMPILER_ABI)
#16 179.1   CMakeLists.txt:3 (project)
#16 179.1
#16 179.1
#16 179.1 -- Configuring incomplete, errors occurred!
#16 179.1
#16 179.1 *** CMake configuration failed
#16 179.2 error: subprocess-exited-with-error
#16 179.2
#16 179.2 × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
#16 179.2 │ exit code: 1
#16 179.2 ╰─> See above for output.
#16 179.2
#16 179.2 note: This error originates from a subprocess, and is likely not a problem with pip.
#16 179.2 full command: /app/.venv/bin/python3.11 /app/.venv/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmp5i2442br
#16 179.2 cwd: /tmp/pip-install-yu70ftdl/llama-cpp-python_dc2c6a4c54e74373b67e863a86da6932
#16 179.2 ERROR: Failed building wheel for llama-cpp-python
#16 179.2 ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
```
I'll leave this open and let's try again sometime in the future.
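For anyone hitting the same crash: the segfaulting binary here is the pip-vendored `ninja` from the build environment, not a system one, so one thing worth trying (untested in this exact Docker setup, so treat it as a sketch) is forcing CMake onto its Makefile generator so ninja never runs. `CMAKE_GENERATOR` is an environment variable CMake itself honors since 3.15:

```shell
# Workaround sketch: skip the vendored ninja by telling CMake to use
# plain Makefiles instead of the Ninja generator (honored by CMake >= 3.15).
export CMAKE_GENERATOR="Unix Makefiles"
echo "generator: $CMAKE_GENERATOR"

# Then retry without reusing cached wheels or build envs:
# pip install --no-cache-dir --force-reinstall llama-cpp-python
```

Alternatively, installing a working system ninja (e.g. `apt-get install ninja-build`) and making sure it wins on `PATH` over the vendored copy might also sidestep the crash.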