Antwa-sensei253 opened 1 year ago
If you look at the error, you can see that it is complaining about the build paths being too long. Instead of using C:\Users\king\Documents\oobabooga_windows, rename oobabooga_windows to ow, giving:

C:\Users\king\Documents\ow
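As a rough sanity check, you can estimate whether a base directory leaves enough headroom under the classic Windows MAX_PATH limit of 260 characters. This is only a sketch: the 220-character allowance for nested build artifacts below is a guessed figure, not one measured from this build.

```python
MAX_PATH = 260  # classic Windows path-length limit

def fits_max_path(base_dir: str, deepest_artifact: int = 220) -> bool:
    """Return True if base_dir leaves room for build artifacts nested up to
    `deepest_artifact` characters deep (a hypothetical allowance)."""
    return len(base_dir) + deepest_artifact <= MAX_PATH

print(fits_max_path(r"C:\Users\king\Documents\oobabooga_windows"))  # False
print(fits_max_path(r"C:\Users\king\Documents\ow"))                 # True
```

With the long folder name the estimate overruns the limit, while the two-letter rename stays comfortably inside it.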
I have a similar issue on a Windows 11 machine:
(localGPT) C:\Sources\localGPT>set CMAKE_ARGS="-DLLAMA_CUBLAS=on"
(localGPT) C:\Sources\localGPT>pip install llama-cpp-python --no-cache-dir
Collecting llama-cpp-python
Downloading llama_cpp_python-0.1.77.tar.gz (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 4.3 MB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in c:\users\coder\miniconda3\envs\localgpt\lib\site-packages (from llama-cpp-python) (4.7.1)
Requirement already satisfied: numpy>=1.20.0 in c:\users\coder\miniconda3\envs\localgpt\lib\site-packages (from llama-cpp-python) (1.25.2)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
Downloading diskcache-5.6.1-py3-none-any.whl (45 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.6/45.6 kB ? eta 0:00:00
Building wheels for collected packages: llama-cpp-python
Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [162 lines of output]
--------------------------------------------------------------------------------
-- Trying 'Ninja (Visual Studio 17 2022 x64 v143)' generator
--------------------------------
---------------------------
----------------------
-----------------
------------
-------
--
Not searching for unused variables given on the command line.
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
Compatibility with CMake < 3.5 will be removed from a future version of
CMake.
Update the VERSION argument <min> value or use a ...<max> suffix to tell
CMake that the project does not need compatibility with older versions.
-- The C compiler identification is MSVC 19.36.32537.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.36.32532/bin/Hostx86/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- The CXX compiler identification is MSVC 19.36.32537.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.36.32532/bin/Hostx86/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Configuring done (2.5s)
-- Generating done (0.0s)
-- Build files have been written to: C:/Users/Coder/AppData/Local/Temp/pip-install-7ungfhva/llama-cpp-python_fd70b59e8083468396b7c5f803b25431/_cmake_test_compile/build
--
-------
------------
-----------------
----------------------
---------------------------
--------------------------------
-- Trying 'Ninja (Visual Studio 17 2022 x64 v143)' generator - success
--------------------------------------------------------------------------------
Configuring Project
Working directory:
C:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431\_skbuild\win-amd64-3.11\cmake-build
Command:
'C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\cmake\data\bin/cmake.exe' 'C:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431' -G Ninja '-DCMAKE_MAKE_PROGRAM:FILEPATH=C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\ninja\data\bin\ninja' -D_SKBUILD_FORCE_MSVC=1930 --no-warn-unused-cli '-DCMAKE_INSTALL_PREFIX:PATH=C:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431\_skbuild\win-amd64-3.11\cmake-install' -DPYTHON_VERSION_STRING:STRING=3.11.4 -DSKBUILD:INTERNAL=TRUE '-DCMAKE_MODULE_PATH:PATH=C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\skbuild\resources\cmake' '-DPYTHON_EXECUTABLE:PATH=C:\Users\Coder\miniconda3\envs\localGPT\python.exe' '-DPYTHON_INCLUDE_DIR:PATH=C:\Users\Coder\miniconda3\envs\localGPT\Include' '-DPYTHON_LIBRARY:PATH=C:\Users\Coder\miniconda3\envs\localGPT\libs\python311.lib' '-DPython_EXECUTABLE:PATH=C:\Users\Coder\miniconda3\envs\localGPT\python.exe' '-DPython_ROOT_DIR:PATH=C:\Users\Coder\miniconda3\envs\localGPT' -DPython_FIND_REGISTRY:STRING=NEVER '-DPython_INCLUDE_DIR:PATH=C:\Users\Coder\miniconda3\envs\localGPT\Include' '-DPython_LIBRARY:PATH=C:\Users\Coder\miniconda3\envs\localGPT\libs\python311.lib' '-DPython3_EXECUTABLE:PATH=C:\Users\Coder\miniconda3\envs\localGPT\python.exe' '-DPython3_ROOT_DIR:PATH=C:\Users\Coder\miniconda3\envs\localGPT' -DPython3_FIND_REGISTRY:STRING=NEVER '-DPython3_INCLUDE_DIR:PATH=C:\Users\Coder\miniconda3\envs\localGPT\Include' '-DPython3_LIBRARY:PATH=C:\Users\Coder\miniconda3\envs\localGPT\libs\python311.lib' '-DCMAKE_MAKE_PROGRAM:FILEPATH=C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\ninja\data\bin\ninja' '"-DLLAMA_CUBLAS=on"' -DCMAKE_BUILD_TYPE:STRING=Release -DLLAMA_CUBLAS=on
Not searching for unused variables given on the command line.
CMake Warning:
Ignoring extra path from command line:
""-DLLAMA_CUBLAS=on""
-- The C compiler identification is MSVC 19.36.32537.0
-- The CXX compiler identification is MSVC 19.36.32537.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.36.32532/bin/Hostx86/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.36.32532/bin/Hostx86/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.41.0.windows.1")
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
CMake Warning at vendor/llama.cpp/CMakeLists.txt:116 (message):
Git repository not found; to enable automatic generation of build info,
make sure Git is installed and the project is a Git repository.
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- Could not find nvcc, please set CUDAToolkit_ROOT.
CMake Warning at vendor/llama.cpp/CMakeLists.txt:283 (message):
cuBLAS not found
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- x86 detected
-- Configuring done (2.2s)
-- Generating done (0.0s)
-- Build files have been written to: C:/Users/Coder/AppData/Local/Temp/pip-install-7ungfhva/llama-cpp-python_fd70b59e8083468396b7c5f803b25431/_skbuild/win-amd64-3.11/cmake-build
[1/7] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\k_quants.c.obj
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_math.h(44): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_math.h(963): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_memory.h(76): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_wstring.h(573): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\string.h(531): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\malloc.h(173): warning C5105: macro expansion producing 'defined' has undefined behavior
[2/7] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj
FAILED: vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.obj
C:\PROGRA~1\MICROS~3\2022\COMMUN~1\VC\Tools\MSVC\1436~1.325\bin\Hostx86\x64\cl.exe /nologo -DGGML_USE_K_QUANTS -D_CRT_SECURE_NO_WARNINGS -IC:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431\vendor\llama.cpp\. /DWIN32 /D_WINDOWS /O2 /Ob2 /DNDEBUG -std:c11 -MD /arch:AVX2 /showIncludes /Fovendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj /Fdvendor\llama.cpp\CMakeFiles\ggml.dir\ /FS -c C:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431\vendor\llama.cpp\ggml.c
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\malloc.h(173): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\time.h(589): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_math.h(44): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_math.h(963): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_search.h(188): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\stdlib.h(79): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\stdlib.h(1286): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_memory.h(76): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\corecrt_wstring.h(573): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\string.h(531): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\stdio.h(378): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\stdio.h(2437): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\float.h(328): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\ucrt\ctype.h(241): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\winbase.h(9254): warning C5105: macro expansion producing 'defined' has undefined behavior
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(487): warning C5103: pasting '/' and '/' does not result in a valid preprocessing token
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\shared\wtypes.h(745): note: in expansion of macro '_VARIANT_BOOL'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(487): error C2059: syntax error: '/'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(502): warning C5103: pasting '/' and '/' does not result in a valid preprocessing token
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\shared\wtypes.h(745): note: in expansion of macro '_VARIANT_BOOL'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(502): error C2059: syntax error: '/'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(530): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(531): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(533): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(534): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(665): error C2079: 'varDefaultValue' uses undefined struct 'tagVARIANT'
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(950): error C2079: 'varValue' uses undefined struct 'tagVARIANT'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(319): warning C5103: pasting '/' and '/' does not result in a valid preprocessing token
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\shared\wtypes.h(745): note: in expansion of macro '_VARIANT_BOOL'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(319): error C2059: syntax error: '/'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(378): error C2371: 'pvarVal': redefinition; different basic types
C:\Program Files (x86)\Windows Kits\10\include\10.0.17763.0\um\oaidl.h(510): note: see declaration of 'pvarVal'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(379): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(380): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(383): error C2059: syntax error: '}'
C:\Program Files (x86)\Windows Kits\10\\include\10.0.17763.0\\um\propidlbase.h(384): error C2059: syntax error: '}'
[3/7] Building CXX object vendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\skbuild\setuptools_wrap.py", line 674, in setup
cmkr.make(make_args, install_target=cmake_install_target, env=env)
File "C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\skbuild\cmaker.py", line 697, in make
self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
File "C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\skbuild\cmaker.py", line 742, in make_impl
raise SKBuildError(msg)
An error occurred while building with CMake.
Command:
'C:\Users\Coder\AppData\Local\Temp\pip-build-env-vs5ug9bk\overlay\Lib\site-packages\cmake\data\bin/cmake.exe' --build . --target install --config Release --
Install target:
install
Source directory:
C:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431
Working directory:
C:\Users\Coder\AppData\Local\Temp\pip-install-7ungfhva\llama-cpp-python_fd70b59e8083468396b7c5f803b25431\_skbuild\win-amd64-3.11\cmake-build
Please check the install target is valid and see CMake's output for more information.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
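One clue in the log above is the CMake warning `Ignoring extra path: ""-DLLAMA_CUBLAS=on""`: on cmd.exe, `set CMAKE_ARGS="-DLLAMA_CUBLAS=on"` stores the quotes as part of the variable's value, so CMake receives a literally quoted argument and discards it. A minimal sketch of passing the flags cleanly, assuming you drive pip from Python (the commented-out `subprocess` call is hypothetical and not run here):

```python
import os

# Build an environment where CMAKE_ARGS carries no embedded quotes.
# cmd.exe's `set` does not strip quotes the way a POSIX shell does,
# which is what produced the '""-DLLAMA_CUBLAS=on""' warning above.
env = dict(os.environ)
env["CMAKE_ARGS"] = "-DLLAMA_CUBLAS=on"  # no surrounding quotes
env["FORCE_CMAKE"] = "1"

# Hypothetical invocation (not executed here):
# import subprocess, sys
# subprocess.check_call([sys.executable, "-m", "pip", "install",
#                        "llama-cpp-python", "--no-cache-dir"], env=env)
```

Equivalently, in cmd.exe simply write `set CMAKE_ARGS=-DLLAMA_CUBLAS=on` without quotes.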
Display adapter: Intel Iris Xe Graphics
This is the setup.py exception captured with VSCode in debug mode (setuptools_wrap.py @ line 677):
An error occurred while building with CMake.
Command:
'C:\\Users\\Coder\\.pyenv\\pyenv-win\\versions\\3.10.5\\lib\\site-packages\\cmake\\data\\bin/cmake.exe' --build . --target install --config Release --
Install target:
install
Source directory:
C:\\Users\\Coder\\Downloads\\llama_cpp_python-0.1.77
Working directory:
C:\\Users\\Coder\\Downloads\\llama_cpp_python-0.1.77\\_skbuild\\win-amd64-3.10\\cmake-build
Please check the install target is valid and see CMake's output for more information.
I am running the latest code and carefully followed the README.md. I searched for my issue but found no solution.
I am trying to enable GPU offloading for my GGML model "TheBloke_Wizard-Vicuna-13B-Uncensored-GGML" with these commands:

1. pip uninstall -y llama-cpp-python
2. set CMAKE_ARGS="-DLLAMA_CUBLAS=on"
3. set FORCE_CMAKE=1
4. pip install llama-cpp-python --no-cache-dir

Step four did not work. I also tried pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir, but no luck. A plain pip install llama-cpp-python (without --no-cache-dir) works correctly.
this is the log:
cmake version: 3.27.1. I did try installing Visual Studio (with the UWP component) and CMake.
I am running: Windows 10, Ryzen 5 5600G CPU, Radeon RX 750 GPU.
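Note that cuBLAS requires an NVIDIA GPU and the CUDA toolkit; the log above already failed with "Could not find nvcc", and a Radeon card cannot use cuBLAS at all. At this llama-cpp-python version, the CLBlast (OpenCL) backend was the usual alternative for AMD GPUs. A sketch of the commands, assuming a cmd.exe session and an installed CLBlast/OpenCL setup:

```bat
rem cuBLAS needs NVIDIA hardware; try the CLBlast backend on a Radeon GPU.
rem No quotes around the value: cmd's `set` would store them literally.
set CMAKE_ARGS=-DLLAMA_CLBLAST=on
set FORCE_CMAKE=1
pip install llama-cpp-python --no-cache-dir
```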