ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://parisneo.github.io/lollms-webui/
Apache License 2.0

Building wheel for llama-cpp-python (pyproject.toml) did not run successfully. #259

Open rempel1234 opened 1 year ago

rempel1234 commented 1 year ago

Expected Behavior

Able to select the llama-cpp-python binding without errors.

Current Behavior

Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [72 lines of output]
      -- Trying 'Ninja' generator
      Not searching for unused variables given on the command line.
      -- The C compiler identification is unknown
      CMake Error at CMakeLists.txt:3 (ENABLE_LANGUAGE):
        No CMAKE_C_COMPILER could be found.

        Tell CMake where to find the compiler by setting either the environment
        variable "CC" or the CMake cache entry CMAKE_C_COMPILER to the full path to
        the compiler, or to the compiler name if it is in the PATH.

      -- Configuring incomplete, errors occurred!
      -- Trying 'Ninja' generator - failure
      -- Trying 'Unix Makefiles' generator
      CMake Error: CMake was unable to find a build program corresponding to "Unix Makefiles".  CMAKE_MAKE_PROGRAM is not set.  You probably need to select a different build tool.
      Not searching for unused variables given on the command line.

      -- Configuring incomplete, errors occurred!
      -- Trying 'Unix Makefiles' generator - failure
      --------------------------------------------------------------------------------

      ********************************************************************************
      scikit-build could not get a working generator for your system. Aborting build.

      Building Linux wheels for Python 3.10 requires a compiler (e.g gcc).
      But scikit-build does *NOT* know how to install it on ubuntu.

      To build compliant wheels, consider using the manylinux system described in PEP-513.
      Get it with "dockcross/manylinux-x64" docker image:

        https://github.com/dockcross/dockcross#readme

      For more details, please refer to scikit-build documentation:

        http://scikit-build.readthedocs.io/en/latest/generators.html#linux

      ********************************************************************************
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

Steps to Reproduce


  1. Launch Ubuntu-22.04 WSL
  2. run git clone ...
  3. cd into gpt4all-ui
  4. ./setup.sh
  5. Browse to 127.0.0.1:9600
  6. Click Settings
  7. Click Binding zoo
  8. Click llama_cpp_official

Possible Solution

Add a warning to the binding's description stating that a C compiler and make are required.
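
For a fresh Ubuntu 22.04 / WSL install, the missing prerequisites can usually be added with apt before retrying the binding. This is only a minimal sketch using the standard Ubuntu package names; setup.sh does not run it for you:

  # Install a C/C++ toolchain (gcc, g++, make) plus the CMake/Ninja build tools the log complains about
  sudo apt update
  sudo apt install -y build-essential cmake ninja-build

After that, selecting llama_cpp_official again should allow the wheel build to find a compiler.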


WillemPieterse commented 1 year ago

I found a possible workaround: activate the environment using the sh file in the scripts directory, then, with the environment active, install llama-cpp-python from the prebuilt v0.1.49 wheel:

pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.49/llama_cpp_python-0.1.49-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

If you need a different architecture or Python version, check the releases page for the matching wheel. Then rerun webui.sh to start the site.
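
Putting the workaround together, the full sequence looks roughly like this. The path to the activation script is an assumption; use whichever activation script your setup.sh created:

  # Assumed location of the environment activation script; adjust to your install
  source ./env/bin/activate
  # Install the prebuilt v0.1.49 wheel for CPython 3.10 on x86_64 Linux
  pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.49/llama_cpp_python-0.1.49-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  # Restart the web UI
  ./webui.sh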