PromtEngineer / localGPT

Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
Apache License 2.0
19.79k stars · 2.2k forks

ERROR: Failed building wheel for llama-cpp-python #78

Open naorsabag opened 1 year ago

naorsabag commented 1 year ago

Describe the bug:

Running `pip install -r requirements.txt` resulted in an error while building the wheel for llama-cpp-python.

Reproduction:

pip install -r requirements.txt
ERROR: Failed building wheel for llama-cpp-python

System Info:

Logs:

Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [130 lines of output]

      --------------------------------------------------------------------------------
      -- Trying 'Ninja' generator
      --------------------------------
      ---------------------------
      ----------------------
      -----------------
      ------------
      -------
      --
      Not searching for unused variables given on the command line.
      -- The C compiler identification is GNU 7.3.1
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: /usr/bin/cc - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- The CXX compiler identification is GNU 7.3.1
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /usr/bin/c++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Configuring done (0.3s)
      -- Generating done (0.0s)
      -- Build files have been written to: /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/_cmake_test_compile/build
      --
      -------
      ------------
      -----------------
      ----------------------
      ---------------------------
      --------------------------------
      -- Trying 'Ninja' generator - success
      --------------------------------------------------------------------------------

      Configuring Project
        Working directory:
          /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/_skbuild/linux-x86_64-3.10/cmake-build
        Command:
          /tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6 -G Ninja -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/ninja/data/bin/ninja --no-warn-unused-cli -DCMAKE_INSTALL_PREFIX:PATH=/tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/_skbuild/linux-x86_64-3.10/cmake-install -DPYTHON_VERSION_STRING:STRING=3.10.8 -DSKBUILD:INTERNAL=TRUE -DCMAKE_MODULE_PATH:PATH=/tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/skbuild/resources/cmake -DPYTHON_EXECUTABLE:PATH=/home/ec2-user/anaconda3/envs/python3/bin/python3.10 -DPYTHON_INCLUDE_DIR:PATH=/home/ec2-user/anaconda3/envs/python3/include/python3.10 -DPYTHON_LIBRARY:PATH=/home/ec2-user/anaconda3/envs/python3/lib/libpython3.10.so -DPython_EXECUTABLE:PATH=/home/ec2-user/anaconda3/envs/python3/bin/python3.10 -DPython_ROOT_DIR:PATH=/home/ec2-user/anaconda3/envs/python3 -DPython_FIND_REGISTRY:STRING=NEVER -DPython_INCLUDE_DIR:PATH=/home/ec2-user/anaconda3/envs/python3/include/python3.10 -DPython3_EXECUTABLE:PATH=/home/ec2-user/anaconda3/envs/python3/bin/python3.10 -DPython3_ROOT_DIR:PATH=/home/ec2-user/anaconda3/envs/python3 -DPython3_FIND_REGISTRY:STRING=NEVER -DPython3_INCLUDE_DIR:PATH=/home/ec2-user/anaconda3/envs/python3/include/python3.10 -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/ninja/data/bin/ninja -DCMAKE_BUILD_TYPE:STRING=Release

      Not searching for unused variables given on the command line.
      -- The C compiler identification is GNU 7.3.1
      -- The CXX compiler identification is GNU 7.3.1
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: /usr/bin/cc - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /usr/bin/c++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      CMake Warning (dev) in CMakeLists.txt:
        A logical block opening on the line

          /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/CMakeLists.txt:9 (if)

        closes on the line

          /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/CMakeLists.txt:31 (endif)

        with mis-matching arguments.
      This warning is for project developers.  Use -Wno-dev to suppress it.

      -- Configuring done (0.3s)
      -- Generating done (0.0s)
      -- Build files have been written to: /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/_skbuild/linux-x86_64-3.10/cmake-build
      [1/2] Generating /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/vendor/llama.cpp/libllama.so
      FAILED: /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/vendor/llama.cpp/libllama.so
      cd /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/vendor/llama.cpp && make libllama.so
      I llama.cpp build info:
      I UNAME_S:  Linux
      I UNAME_P:  x86_64
      I UNAME_M:  x86_64
      I CFLAGS:   -I.              -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -march=native -mtune=native
      I CXXFLAGS: -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native
      I LDFLAGS:
      I CC:       cc (GCC) 7.3.1 20180712 (Red Hat 7.3.1-15)
      I CXX:      g++ (GCC) 7.3.1 20180712 (Red Hat 7.3.1-15)

      g++ -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native -c llama.cpp -o llama.o
      llama.cpp: In function ‘size_t llama_set_state_data(llama_context*, const uint8_t*)’:
      llama.cpp:2610:36: warning: cast from type ‘const uint8_t* {aka const unsigned char*}’ to type ‘void*’ casts away qualifiers [-Wcast-qual]
                   kin3d->data = (void *) in;
                                          ^~
      llama.cpp:2614:36: warning: cast from type ‘const uint8_t* {aka const unsigned char*}’ to type ‘void*’ casts away qualifiers [-Wcast-qual]
                   vin3d->data = (void *) in;
                                          ^~
      cc  -I.              -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -march=native -mtune=native   -c ggml.c -o ggml.o
      ggml.c: In function ‘ggml_vec_dot_q4_2_q8_0’:
      ggml.c:3253:40: warning: implicit declaration of function ‘_mm256_set_m128’; did you mean ‘_mm256_set_epi8’? [-Wimplicit-function-declaration]
               const __m256 d = _mm256_mul_ps(_mm256_set_m128(d1, d0), _mm256_broadcast_ss(&y[i].d));
                                              ^~~~~~~~~~~~~~~
                                              _mm256_set_epi8
      ggml.c:3253:40: error: incompatible type for argument 1 of ‘_mm256_mul_ps’
      In file included from /usr/lib/gcc/x86_64-redhat-linux/7/include/immintrin.h:41:0,
                       from ggml.c:189:
      /usr/lib/gcc/x86_64-redhat-linux/7/include/avxintrin.h:317:1: note: expected ‘__m256 {aka __vector(8) float}’ but argument is of type ‘int’
       _mm256_mul_ps (__m256 __A, __m256 __B)
       ^~~~~~~~~~~~~
      ggml.c:3257:22: warning: implicit declaration of function ‘_mm256_set_m128i’; did you mean ‘_mm256_set_epi8’? [-Wimplicit-function-declaration]
               __m256i bx = _mm256_set_m128i(bx1, bx0);
                            ^~~~~~~~~~~~~~~~
                            _mm256_set_epi8
      ggml.c:3257:22: error: incompatible types when initializing type ‘__m256i {aka __vector(4) long long int}’ using type ‘int’
      make: *** [ggml.o] Error 1
      ninja: build stopped: subcommand failed.
      Traceback (most recent call last):
        File "/tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/skbuild/setuptools_wrap.py", line 674, in setup
          cmkr.make(make_args, install_target=cmake_install_target, env=env)
        File "/tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/skbuild/cmaker.py", line 697, in make
          self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
        File "/tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/skbuild/cmaker.py", line 742, in make_impl
          raise SKBuildError(msg)

      An error occurred while building with CMake.
        Command:
          /tmp/pip-build-env-jp9j4tj4/overlay/lib/python3.10/site-packages/cmake/data/bin/cmake --build . --target install --config Release --
        Install target:
          install
        Source directory:
          /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6
        Working directory:
          /tmp/pip-install-nq5sz34h/llama-cpp-python_5dc4be8401d844e39e0c230487042cf6/_skbuild/linux-x86_64-3.10/cmake-build
      Please check the install target is valid and see CMake's output for more information.

      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
Otoliths commented 1 year ago

If you encounter an error while building a wheel during the pip install process, you may need to install a C++ compiler on your computer.

For Windows 10/11

To install a C++ compiler on Windows 10/11, follow these steps:

Install Visual Studio 2022. Make sure the following components are selected:

- Universal Windows Platform development
- C++ CMake tools for Windows

teleprint-me commented 1 year ago

Looks like you might be running a newer Debian-based distro.

This should be reported upstream to the llama-cpp-python repo, because that's what's failing to build.

It could be a dependency issue; that's usually the cause when I see an issue like this on Linux. It's worth investigating.

naorsabag commented 1 year ago

I ran the following, as @Otoliths suggested, and it solved the issue:

conda install -c conda-forge c-compiler
conda install -c "conda-forge/label/cf202003" c-compiler
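For anyone adapting this on Linux, here is a minimal sketch of the retry sequence, assuming the conda-forge compilers installed by the commands above land in `$CONDA_PREFIX/bin` (the paths are illustrative -- verify them in your environment):

```shell
# Point pip's build backend at the conda-provided compiler, falling back
# to the system compiler if no conda env is active (illustrative paths).
export CC="${CONDA_PREFIX:-/usr}/bin/cc"
export CXX="${CONDA_PREFIX:-/usr}/bin/c++"
echo "CC=$CC"

# Then retry the failed install with a clean cache, e.g.:
# pip install --no-cache-dir -r requirements.txt
```

Exporting `CC`/`CXX` matters because the wheel is built in an isolated environment, so pip will otherwise use whatever compiler CMake discovers on its own.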

DarkAbhi commented 1 year ago

I get the same error on Linux.

@naorsabag's solution didn't work for me.

Conda version: 23.3.1
Python version: 3.10.6

OS details:

OS: Ubuntu 22.04.2 LTS x86_64
Host: MS-7A38 8.0
Kernel: 5.19.0-43-generic
Shell: zsh 5.8.1
CPU: AMD Ryzen 5 3600 (12) @ 3.600GHz
GPU: NVIDIA GeForce RTX 2060 SUPER
Memory: 4433MiB / 40061MiB

Error:

Building wheels for collected packages: llama-cpp-python, hnswlib
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [80 lines of output]

      --------------------------------------------------------------------------------
      -- Trying 'Ninja' generator
      --------------------------------
      ---------------------------
      ----------------------
      -----------------
      ------------
      -------
      --
      CMake Error at /tmp/pip-build-env-gq9lomr_/overlay/lib/python3.10/site-packages/cmake/data/share/cmake-3.26/Modules/CMakeDetermineCCompiler.cmake:49 (message):
        Could not find compiler set in environment variable CC:

        ccache gcc.
      Call Stack (most recent call first):
        CMakeLists.txt:3 (ENABLE_LANGUAGE)

      Not searching for unused variables given on the command line.

      CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
      -- Configuring incomplete, errors occurred!
      --
      -------
      ------------
      -----------------
      ----------------------
      ---------------------------
      --------------------------------
      -- Trying 'Ninja' generator - failure
      --------------------------------------------------------------------------------

      --------------------------------------------------------------------------------
      -- Trying 'Unix Makefiles' generator
      --------------------------------
      ---------------------------
      ----------------------
      -----------------
      ------------
      -------
      --
      CMake Error at /tmp/pip-build-env-gq9lomr_/overlay/lib/python3.10/site-packages/cmake/data/share/cmake-3.26/Modules/CMakeDetermineCCompiler.cmake:49 (message):
        Could not find compiler set in environment variable CC:

        ccache gcc.
      Call Stack (most recent call first):
        CMakeLists.txt:3 (ENABLE_LANGUAGE)

      Not searching for unused variables given on the command line.

      CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
      -- Configuring incomplete, errors occurred!
      --
      -------
      ------------
      -----------------
      ----------------------
      ---------------------------
      --------------------------------
      -- Trying 'Unix Makefiles' generator - failure
      --------------------------------------------------------------------------------

                      ********************************************************************************
                      scikit-build could not get a working generator for your system. Aborting build.

                      Building Linux wheels for Python 3.10 requires a compiler (e.g gcc).
      But scikit-build does *NOT* know how to install it on ubuntu

      To build compliant wheels, consider using the manylinux system described in PEP-513.
      Get it with "dockcross/manylinux-x64" docker image:

        https://github.com/dockcross/dockcross#readme

      For more details, please refer to scikit-build documentation:

        http://scikit-build.readthedocs.io/en/latest/generators.html#linux

                      ********************************************************************************
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
  Building wheel for hnswlib (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for hnswlib (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [55 lines of output]
      running bdist_wheel
      running build
      running build_ext
      creating tmp
      ccache gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -I/home/abhishek/Documents/localGPT/.venv/include -I/usr/include/python3.10 -c /tmp/tmpt9xac4ma.cpp -o tmp/tmpt9xac4ma.o -std=c++14
      ccache gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -I/home/abhishek/Documents/localGPT/.venv/include -I/usr/include/python3.10 -c /tmp/tmpkru11dic.cpp -o tmp/tmpkru11dic.o -std=c++11
      Traceback (most recent call last):
        File "/home/abhishek/Documents/localGPT/.venv/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
          main()
        File "/home/abhishek/Documents/localGPT/.venv/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/abhishek/Documents/localGPT/.venv/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 261, in build_wheel
          return _build_backend().build_wheel(wheel_directory, config_settings,
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 416, in build_wheel
          return self._build_with_temp_dir(['bdist_wheel'], '.whl',
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 401, in _build_with_temp_dir
          self.run_setup()
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 116, in <module>
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/__init__.py", line 107, in setup
          return distutils.core.setup(**attrs)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
          return run_commands(dist)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
          dist.run_commands()
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
          self.run_command(cmd)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 1234, in run_command
          super().run_command(command)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/wheel/bdist_wheel.py", line 343, in run
          self.run_command("build")
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
          self.distribution.run_command(command)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 1234, in run_command
          super().run_command(command)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/command/build.py", line 131, in run
          self.run_command(cmd_name)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
          self.distribution.run_command(command)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 1234, in run_command
          super().run_command(command)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/command/build_ext.py", line 84, in run
          _build_ext.run(self)
        File "/tmp/pip-build-env-qj37ep6f/overlay/lib/python3.10/site-packages/setuptools/_distutils/command/build_ext.py", line 345, in run
          self.build_extensions()
        File "<string>", line 103, in build_extensions
        File "<string>", line 70, in cpp_flag
      RuntimeError: Unsupported compiler -- at least C++11 support is needed!
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for hnswlib
Failed to build llama-cpp-python hnswlib
ERROR: Could not build wheels for llama-cpp-python, hnswlib, which is required to install pyproject.toml-based projects
DarkAbhi commented 1 year ago

A chat with ChatGPT helped resolve this; sharing it here for reference.

https://chat.openai.com/share/161a66d1-fe1a-4cd9-93e2-ef144471b522
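For reference, the failure in the log above is that `CC` was set to `ccache gcc` while `ccache` apparently wasn't installed, so CMake could not resolve the compiler. A sketch of the workaround, assuming an apt-based system like the Ubuntu box described above:

```shell
# If ccache is not actually installed, drop the CC/CXX override so CMake
# falls back to the default compiler. (Alternatively, install ccache,
# e.g. `sudo apt install ccache` on Ubuntu, and keep the override.)
if ! command -v ccache >/dev/null 2>&1; then
    unset CC CXX
fi
```

After that, retrying `pip install -r requirements.txt` should let CMake detect `/usr/bin/cc` on its own.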

adjiap commented 1 year ago

Had the same problem and followed @Otoliths' advice above (install Visual Studio 2022 with the "Universal Windows Platform development" and "C++ CMake tools for Windows" components). I'm on Windows 11 with Conda 23.5.

Here's a screenshot from my VS Installer.

yhaiqiang commented 1 year ago

I ran the following, as @Otoliths suggested, and it solved the issue:

conda install -c conda-forge c-compiler
conda install -c "conda-forge/label/cf202003" c-compiler

Thanks, it works.