himanshus110 opened 1 year ago
Keen to know this one as well. Having the same issue.
macOS, CPU only; I get the same error after `make docker`, `make server-run`, and `make build`.
I'm using a conda environment. I've set the CMAKE_C_COMPILER and CMAKE_CXX_COMPILER variables, but the error still says they aren't set.
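For what it's worth, CMake doesn't read a plain `CMAKE_C_COMPILER` environment variable (it reads `CC`/`CXX`), and pip builds in an isolated environment, so exported values may never reach the configure step. One way through is passing the compilers as `-D` cache entries via `CMAKE_ARGS`, which llama-cpp-python's scikit-build-core build honors. A minimal sketch, assuming gcc/g++ are on your conda env's PATH (swap in clang/clang++ as needed):

```python
import os
import shutil

# Resolve compilers from the active (e.g. conda) environment.
# Assumption: gcc/g++ are on PATH; adjust to your toolchain.
cc = shutil.which("gcc") or shutil.which("cc")
cxx = shutil.which("g++") or shutil.which("c++")

env = dict(os.environ)
# Pass the compilers as CMake cache entries; scikit-build-core reads CMAKE_ARGS.
env["CMAKE_ARGS"] = f"-DCMAKE_C_COMPILER={cc} -DCMAKE_CXX_COMPILER={cxx}"
print(env["CMAKE_ARGS"])

# Then run pip with this environment, e.g.:
# subprocess.run([sys.executable, "-m", "pip", "install",
#                 "llama-cpp-python", "--no-cache-dir"], env=env, check=True)
```

The shell one-liner equivalent is `CMAKE_ARGS="-DCMAKE_C_COMPILER=$(which gcc) -DCMAKE_CXX_COMPILER=$(which g++)" pip install llama-cpp-python --no-cache-dir`.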
Arch Linux here, and I get this problem as well.
I clone the llama-cpp submodule directly in Dockerfile:
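For reference, a hypothetical version of that Dockerfile step (the URL is llama.cpp's upstream; the destination path is an assumption):

```dockerfile
# Clone llama.cpp directly instead of relying on the git submodule
# being initialized on the host before the image build.
RUN git clone --depth 1 https://github.com/ggerganov/llama.cpp /app/vendor/llama.cpp
```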
@himanshus110 - I managed to install it on Windows with a few changes. Could you try pulling from this fork and following the extra steps mentioned here?
I have created PR #848 to include it in the main repo. Hopefully it works for you too.
NOTE: OpenBLAS can still be tricky. You can also try without the -DLLAMA_OPENBLAS=on argument if it complains about OpenBLAS.
```
pip install llama-cpp-python
Collecting llama-cpp-python
  Using cached llama_cpp_python-0.2.79.tar.gz (50.3 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in e:\stable-diffusion-webui\extensions\comfyui_windows_portable\comfyui\venv\lib\site-packages (from llama-cpp-python) (4.12.2)
Requirement already satisfied: numpy>=1.20.0 in e:\stable-diffusion-webui\extensions\comfyui_windows_portable\comfyui\venv\lib\site-packages (from llama-cpp-python) (1.26.4)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
  Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Requirement already satisfied: jinja2>=2.11.3 in e:\stable-diffusion-webui\extensions\comfyui_windows_portable\comfyui\venv\lib\site-packages (from llama-cpp-python) (3.1.4)
Requirement already satisfied: MarkupSafe>=2.0 in e:\stable-diffusion-webui\extensions\comfyui_windows_portable\comfyui\venv\lib\site-packages (from jinja2>=2.11.3->llama-cpp-python) (2.1.5)
Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      scikit-build-core 0.9.6 using CMake 3.28.1 (wheel)
      Configuring CMake...
      2024-06-20 11:30:31,404 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
      loading initial cache file C:\Users\mfker\AppData\Local\Temp\tmphjhyhtin\build\CMakeInit.txt
      -- Building for: NMake Makefiles
      CMake Error at CMakeLists.txt:3 (project):
        Running
         'nmake' '-?'
        failed with:
         no such file or directory
      CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
      CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
      -- Configuring incomplete, errors occurred!
      *** CMake configuration failed
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
```

I'm getting this error from pip install. Thanks for your help!
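On Windows, that `'nmake' '-?'` / `CMAKE_C_COMPILER not set` failure usually means CMake fell back to the NMake generator without a Visual C++ toolchain on PATH: either run pip from a "Developer Command Prompt" (after installing the Visual Studio Build Tools), or point CMake at your installed generator explicitly. A sketch of the latter (the generator name "Visual Studio 17 2022" is an assumption; match it to your toolset):

```python
import os

# CMake 3.15+ honors the CMAKE_GENERATOR environment variable, so pip's
# isolated build will use it instead of falling back to NMake Makefiles.
env = dict(os.environ)
env["CMAKE_GENERATOR"] = "Visual Studio 17 2022"
print(env["CMAKE_GENERATOR"])

# Then run pip with this environment, e.g.:
# subprocess.run([sys.executable, "-m", "pip", "install",
#                 "llama-cpp-python", "--no-cache-dir"], env=env, check=True)
```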
Having the same issue, a year on. Has anyone figured it out?
So I solved it by installing an earlier version.
Never thought of trying an older version; it worked for me too. I installed v0.2.70. The changes should be investigated, maybe a bug in the requirements?
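If anyone wants to script that fallback, a rough sketch (the ==0.2.70 pin comes from this thread; the helper itself is illustrative):

```python
import subprocess
import sys

# Specs to try in order: latest first, then the pin that reportedly
# built cleanly for other users in this thread.
CANDIDATES = ["llama-cpp-python", "llama-cpp-python==0.2.70"]

def first_installable(specs):
    """Return the first spec pip can install, or None if all fail."""
    for spec in specs:
        result = subprocess.run(
            [sys.executable, "-m", "pip", "install", "--no-cache-dir", spec],
            capture_output=True,
        )
        if result.returncode == 0:
            return spec
    return None
```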
I tried older versions too, but without progress:
> pip install llama-cpp-python==0.2.70 --no-cache-dir
Collecting llama-cpp-python==0.2.70
Downloading llama_cpp_python-0.2.70.tar.gz (46.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 46.4/46.4 MB 11.6 MB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in ./venv/lib/python3.12/site-packages (from llama-cpp-python==0.2.70) (4.12.2)
Requirement already satisfied: numpy>=1.20.0 in ./venv/lib/python3.12/site-packages (from llama-cpp-python==0.2.70) (2.1.2)
Collecting diskcache>=5.6.1 (from llama-cpp-python==0.2.70)
Downloading diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Requirement already satisfied: jinja2>=2.11.3 in ./venv/lib/python3.12/site-packages (from llama-cpp-python==0.2.70) (3.1.4)
Requirement already satisfied: MarkupSafe>=2.0 in ./venv/lib/python3.12/site-packages (from jinja2>=2.11.3->llama-cpp-python==0.2.70) (3.0.2)
Downloading diskcache-5.6.3-py3-none-any.whl (45 kB)
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [78 lines of output]
*** scikit-build-core 0.10.7 using CMake 3.30.5 (wheel)
*** Configuring CMake...
loading initial cache file /var/folders/6q/dn566lyd20z9x1phnzlwgmlc0000gn/T/tmpnay77ovj/build/CMakeInit.txt
-- The C compiler identification is AppleClang 16.0.0.16000026
-- The CXX compiler identification is AppleClang 16.0.0.16000026
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.39.5 (Apple Git-154)")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Accelerate framework found
-- Metal framework found
-- The ASM compiler identification is AppleClang
-- Found assembler: /Library/Developer/CommandLineTools/usr/bin/gcc
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: arm64
-- ARM detected
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E - Failed
CMake Warning (dev) at vendor/llama.cpp/CMakeLists.txt:1270 (install):
Target llama has RESOURCE files but no RESOURCE DESTINATION.
This warning is for project developers. Use -Wno-dev to suppress it.
CMake Warning (dev) at CMakeLists.txt:26 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
This warning is for project developers. Use -Wno-dev to suppress it.
CMake Warning (dev) at CMakeLists.txt:35 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
This warning is for project developers. Use -Wno-dev to suppress it.
-- Configuring done (0.6s)
-- Generating done (0.0s)
-- Build files have been written to: /var/folders/6q/dn566lyd20z9x1phnzlwgmlc0000gn/T/tmpnay77ovj/build
*** Building project with Ninja...
Change Dir: '/var/folders/6q/dn566lyd20z9x1phnzlwgmlc0000gn/T/tmpnay77ovj/build'
Run Build Command(s): /private/var/folders/6q/dn566lyd20z9x1phnzlwgmlc0000gn/T/pip-build-env-1kbdcd09/normal/lib/python3.12/site-packages/ninja/data/bin/ninja -v
......
......
......
[11/30] /Library/Developer/CommandLineTools/usr/bin/gcc -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_METAL_EMBED_LIBRARY -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_LLAMAFILE -DGGML_USE_METAL -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -I/private/var/folders/6q/dn566lyd20z9x1phnzlwgmlc0000gn/T/pip-install-9dh6g3in/llama-cpp-python_b06b60e65d3c4c63bcc9b3acbf8e5abc/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX15.0.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX15.0.sdk -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -c /private/var/folders/6q/dn566lyd20z9x1phnzlwgmlc0000gn/T/pip-install-9dh6g3in/llama-cpp-python_b06b60e65d3c4c63bcc9b3acbf8e5abc/vendor/llama.cpp/ggml.c
ninja: build stopped: subcommand failed.
*** CMake build failed
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: ERROR: Failed to build in
I'm doing this on a Mac M1 with Python 3.12.7.
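Before retrying on Apple Silicon, it can help to confirm what toolchain pip's build will actually see. A small diagnostic sketch (nothing here is specific to llama-cpp-python):

```python
import platform
import shutil
import subprocess

# Report the pieces the wheel build depends on: CPU arch, Python version,
# and the C compiler CMake will pick up from the Command Line Tools.
print("machine:", platform.machine())  # arm64 on M1/M2
print("python :", platform.python_version())
for tool in ("clang", "cmake", "ninja"):
    print(f"{tool:6}:", shutil.which(tool) or "NOT FOUND")

clang = shutil.which("clang")
if clang:
    out = subprocess.run([clang, "--version"], capture_output=True, text=True)
    print(out.stdout.splitlines()[0])  # e.g. "Apple clang version 16.0.0 ..."
```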
```
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
    scikit-build-core 0.5.1 using CMake 3.27.6 (wheel)
    Configuring CMake...
    2023-10-03 20:07:26,143 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
    loading initial cache file C:\Users\h02si\AppData\Local\Temp\tmp95lm135w\build\CMakeInit.txt
    -- Building for: NMake Makefiles
    CMake Error at CMakeLists.txt:3 (project):
      Running

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
```