abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Unable to pip install #1804

Closed: chinthasaicharan closed this issue 1 month ago

chinthasaicharan commented 1 month ago

Prerequisites

Please answer the following questions for yourself before submitting an issue.

Expected Behavior

`pip install` should have built and installed llama-cpp-python successfully.

Current Behavior

The build failed with errors and the package did not install.

Environment and Context

Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.

$ lscpu

$ uname -a

$ python3 --version
$ make --version
$ g++ --version

Failure Information (for bugs)

Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)

Steps to Reproduce

pip install llama-cpp-python

Note: Many issues seem to be regarding functional or performance issues / differences with llama.cpp. In these cases we need to confirm that you're comparing against the version of llama.cpp that was built with your python package, and which parameters you're passing to the context.

Try the following:

  1. git clone https://github.com/abetlen/llama-cpp-python
  2. cd llama-cpp-python
  3. rm -rf _skbuild/ # delete any old builds
  4. python -m pip install .
  5. cd ./vendor/llama.cpp
  6. Follow llama.cpp's instructions to cmake llama.cpp
  7. Run llama.cpp's ./main with the same arguments you previously passed to llama-cpp-python and see if you can reproduce the issue. If you can, log an issue with llama.cpp

Failure Logs

Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in c:\users\saich\appdata\local\programs\python\python310\lib\site-packages (from llama_cpp_python==0.3.1) (4.9.0)
Requirement already satisfied: numpy>=1.20.0 in c:\users\saich\appdata\local\programs\python\python310\lib\site-packages (from llama_cpp_python==0.3.1) (1.23.3)
Collecting diskcache>=5.6.1 (from llama_cpp_python==0.3.1)
  Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Requirement already satisfied: jinja2>=2.11.3 in c:\users\saich\appdata\local\programs\python\python310\lib\site-packages (from llama_cpp_python==0.3.1) (3.1.2)
Requirement already satisfied: MarkupSafe>=2.0 in c:\users\saich\appdata\local\programs\python\python310\lib\site-packages (from jinja2>=2.11.3->llama_cpp_python==0.3.1) (2.1.2)
Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)
Building wheels for collected packages: llama_cpp_python
  Building wheel for llama_cpp_python (pyproject.toml) ... error
  error: subprocess-exited-with-error

× Building wheel for llama_cpp_python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
    scikit-build-core 0.10.7 using CMake 3.30.5 (wheel)
    Configuring CMake...
    2024-10-21 18:28:13,191 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
    loading initial cache file C:\Users\saich\AppData\Local\Temp\tmphnxgeg2z\build\CMakeInit.txt
    -- Building for: NMake Makefiles
    CMake Error at CMakeLists.txt:3 (project): Running

     'nmake' '-?'

    failed with:

     no such file or directory

  CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
  CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
  -- Configuring incomplete, errors occurred!

  *** CMake configuration failed
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama_cpp_python
Failed to build llama_cpp_python
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama_cpp_python)
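The CMake errors in the log (`'nmake'` not found, `CMAKE_C_COMPILER not set`) usually mean no C/C++ toolchain is visible to CMake on this machine. A minimal diagnostic sketch to check which build tools are actually on PATH; the tool names come from the error output above, and `check_build_tools` is a hypothetical helper, not part of llama-cpp-python:

```python
import shutil

def check_build_tools(tools=("nmake", "cl", "cmake", "gcc")):
    """Map each tool name to its full path, or None if it is not on PATH."""
    return {tool: shutil.which(tool) for tool in tools}

# On Windows, the "NMake Makefiles" generator needs nmake.exe and cl.exe,
# which are only on PATH inside a Visual Studio developer prompt.
for tool, path in check_build_tools().items():
    print(f"{tool}: {path or 'NOT FOUND'}")
```

If `nmake` and `cl` both print `NOT FOUND`, CMake cannot configure the build, which matches the failure log above.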

Example environment info:

llama-cpp-python$ git log | head -1
commit 47b0aa6e957b93dbe2c29d53af16fbae2dd628f2

llama-cpp-python$ python3 --version
Python 3.10.10

llama-cpp-python$ pip list | egrep "uvicorn|fastapi|sse-starlette|numpy"
fastapi                  0.95.0
numpy                    1.24.3
sse-starlette            1.3.3
uvicorn                  0.21.1

llama-cpp-python/vendor/llama.cpp$ git log | head -3
commit 66874d4fbcc7866377246efbcee938e8cc9c7d76
Author: Kerfuffle <44031344+KerfuffleV2@users.noreply.github.com>
Date:   Thu May 25 20:18:01 2023 -0600
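For completeness: the `'nmake'` / `CMAKE_C_COMPILER` failure above is typically resolved by installing the Visual Studio Build Tools (with the "Desktop development with C++" workload) and running pip from a developer prompt, or by pointing CMake at a generator that is actually installed. A hedged sketch, assuming Visual Studio 2022 Build Tools are present; the generator name is an assumption, so adjust it to the installed VS version, and on Windows `cmd` use `set` instead of `export`:

```shell
# Override the default "NMake Makefiles" generator, which requires
# nmake.exe on PATH; CMake (and hence scikit-build-core) honors the
# CMAKE_GENERATOR environment variable.
export CMAKE_GENERATOR="Visual Studio 17 2022"

# Then rebuild from a clean cache so pip does not reuse the failed wheel:
#   pip install --no-cache-dir --force-reinstall llama-cpp-python
echo "CMAKE_GENERATOR=$CMAKE_GENERATOR"
```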