jreves opened this issue 11 months ago
Hey @jreves, do you encounter the same issue if you try to compile llama.cpp directly with CMake?
I'll give that a try and report back - thanks!
OK - I'm not really a Windows developer, but here's what I was able to test:
Following the instructions for "Building llama with CLBlast" on Windows with CMake on this page (https://github.com/ggerganov/llama.cpp), I got very similar results - again, referencing the version of CLBlast I installed using "conda install". Here's the output:
c:\Temp\llama.cpp\build>cmake .. -DLLAMA_CLBLAST=ON -DCLBlast_DIR=C:\Users\joere\anaconda3\envs\llama\Library\lib\cmake\CLBlast
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.19045.
-- The C compiler identification is MSVC 19.38.33130.0
-- The CXX compiler identification is MSVC 19.38.33130.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.38.33130/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.38.33130/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.42.0.windows.2")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- CLBlast found
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM:
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Failed
-- Performing Test HAS_AVX512_2
-- Performing Test HAS_AVX512_2 - Failed
-- Configuring done (9.6s)
-- Generating done (0.5s)
-- Build files have been written to: C:/Temp/llama.cpp/build
c:\Temp\llama.cpp\build>cmake --build . --config Release
MSBuild version 17.8.3+195e7f5a3 for .NET Framework
1>Checking Build System
Generating build details from Git
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.42.0.windows.2")
Building Custom Rule C:/Temp/llama.cpp/common/CMakeLists.txt
build-info.cpp
build_info.vcxproj -> C:\Temp\llama.cpp\build\common\build_info.dir\Release\build_info.lib
Building Custom Rule C:/Temp/llama.cpp/CMakeLists.txt
ggml.c
ggml-alloc.c
ggml-backend.c
ggml-quants.c
C:\Temp\llama.cpp\ggml-backend.c(875,21): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 1 has type 'unsigned __int64' [C:\Temp\llama.cpp\build\ggml.vcxproj]
  C:\Temp\llama.cpp\ggml-backend.c(875,21): consider using '%llu' in the format string
  C:\Temp\llama.cpp\ggml-backend.c(875,21): consider using '%Iu' in the format string
  C:\Temp\llama.cpp\ggml-backend.c(875,21): consider using '%I64u' in the format string
C:\Temp\llama.cpp\ggml-quants.c(627,26): warning C4244: '=': conversion from 'float' to 'int8_t', possible loss of data [C:\Temp\llama.cpp\build\ggml.vcxproj]
C:\Temp\llama.cpp\ggml-quants.c(845,36): warning C4244: '=': conversion from 'float' to 'int8_t', possible loss of data [C:\Temp\llama.cpp\build\ggml.vcxproj]
C:\Temp\llama.cpp\ggml-quants.c(846,36): warning C4244: '=': conversion from 'float' to 'int8_t', possible loss of data [C:\Temp\llama.cpp\build\ggml.vcxproj]
Generating Code...
ggml-opencl.cpp
C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.38.33130\include\xkeycheck.h(54,1): error C1189: #error: The C++ Standard Library forbids macroizing the keyword "bool". Enable warning C4005 to find the forbidden define. [C:\Temp\llama.cpp\build\ggml.vcxproj]
  (compiling source file '../ggml-opencl.cpp')
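For context on that last error: C1189 here means that, by the time ggml-opencl.cpp pulls in the C++ standard library, some earlier header has defined bool as a preprocessor macro, which MSVC's standard library explicitly rejects. A minimal sketch that reproduces the same diagnostic - the #define is a stand-in, and my assumption is that the real one comes from an OpenCL/CLBlast header in the conda environment, since the error only appears when CLBlast is found:

    // repro.cpp - deliberately fails to compile with MSVC.
    // Some C-oriented headers define a bool-like macro for pre-C99 compilers.
    // If such a header is included before any C++ standard library header,
    // MSVC's <xkeycheck.h> stops the build with error C1189.
    #define bool int   // stand-in for the offending define (hypothetical source)
    #include <vector>  // any standard header now triggers:
    // xkeycheck.h(54,1): error C1189: The C++ Standard Library forbids
    // macroizing the keyword "bool".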
I don't have a gcc compiler on this Windows machine, so this is with the latest MSVC. I don't have enough information to conclude there's an issue with the version of CLBlast I have, but it seems like there's a mismatch somewhere.
Reviewing the instructions, I discovered there's an option to compile with hipBLAS; I was able to install ROCm and successfully compile llama.cpp with hipBLAS. I'm still trying to get some appropriate models downloaded and figure out the best way to test this standalone. I still need Python bindings, and to integrate this into my conda virtual environment.
Compiling with the hipBLAS option produces a bazillion warnings about logging, but it compiles with zero errors. So, for my purposes, I'm going to go with this option.
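For reference, the hipBLAS build followed the same CMake pattern as above; roughly this, from memory, assuming the ROCm toolchain is on PATH (the exact generator and compiler flags are whatever the llama.cpp README prescribed at the time):

    c:\Temp\llama.cpp\build>cmake .. -DLLAMA_HIPBLAS=ON
    c:\Temp\llama.cpp\build>cmake --build . --config Release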
Is there anything else I can test for you?
OK - having successfully compiled llama.cpp standalone for HIP, I'm now struggling through #695 to try to get a clean pip install....
Is it not possible to integrate the python bindings into an existing custom / working llama.cpp build?
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
I expect to use llama.cpp on my Windows 10 machine with my AMD GPU. I'm attempting to install with CLBlast, which the build seems to detect. I'm installing in an Anaconda virtual environment, and I expect to be able to load a model and initialize a Llama instance to run through a Jupyter notebook.
I installed CLBlast through conda, and it appears to have installed successfully. It's visible in my environment. Next, I'm trying to use pip to install llama-cpp-python with the following command:
(llama) C:\Users\joere>pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir --verbose
(llama) is the name of my virtual environment.
I set the following environment variables: CMAKE_ARGS=-DLLAMA_CLBLAST=on, LLAMA_CLBLAST=1, FORCE_CMAKE=1, and IgnoreWarnIntDirInTempDetected=true.
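For completeness, these were set in cmd.exe with set before invoking pip (a PowerShell session would need $env:CMAKE_ARGS = "-DLLAMA_CLBLAST=on" and so on instead):

    set CMAKE_ARGS=-DLLAMA_CLBLAST=on
    set LLAMA_CLBLAST=1
    set FORCE_CMAKE=1
    set IgnoreWarnIntDirInTempDetected=true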
I'm expecting the build process for llama-cpp-python to recognize and use the existing CLBlast installation, complete successfully, and then run my instances on my GPU. Instead, I can't get a successful build.
Current Behavior
Executing in the environment where I've installed CLBlast, CLBlast is apparently detected, but the compilation of ggml-opencl.cpp fails with: error C1189: #error: The C++ Standard Library forbids macroizing the keyword "bool".
Executing the install command for llama-cpp-python in an environment that does not have access to the CLBlast installation reports that CLBlast is not detected, and then completes the build and installation successfully. I can then observe BLAS=0, and inference executes on my CPU. In short: whenever the build detects CLBlast, it fails.
Here's the verbose output:
(llama) C:\Users\joere>pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir --verbose
Using pip 23.3 from C:\Users\joere\anaconda3\envs\llama\lib\site-packages\pip (python 3.10)
Collecting llama-cpp-python
  Downloading llama_cpp_python-0.2.19.tar.gz (7.8 MB)
     ---------------------------------------- 7.8/7.8 MB 15.6 MB/s eta 0:00:00
  Running command pip subprocess to install build dependencies
  Collecting scikit-build-core>=0.5.1 (from scikit-build-core[pyproject]>=0.5.1)
    Using cached scikit_build_core-0.6.1-py3-none-any.whl.metadata (17 kB)
  Collecting exceptiongroup (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1)
    Using cached exceptiongroup-1.2.0-py3-none-any.whl.metadata (6.6 kB)
  Collecting packaging>=20.9 (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1)
    Using cached packaging-23.2-py3-none-any.whl.metadata (3.2 kB)
  Collecting tomli>=1.1 (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1)
    Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
  Collecting pathspec>=0.10.1 (from scikit-build-core[pyproject]>=0.5.1)
    Using cached pathspec-0.11.2-py3-none-any.whl.metadata (19 kB)
  Collecting pyproject-metadata>=0.5 (from scikit-build-core[pyproject]>=0.5.1)
    Using cached pyproject_metadata-0.7.1-py3-none-any.whl (7.4 kB)
  Using cached scikit_build_core-0.6.1-py3-none-any.whl (134 kB)
  Using cached packaging-23.2-py3-none-any.whl (53 kB)
  Using cached pathspec-0.11.2-py3-none-any.whl (29 kB)
  Using cached exceptiongroup-1.2.0-py3-none-any.whl (16 kB)
  Installing collected packages: tomli, pathspec, packaging, exceptiongroup, scikit-build-core, pyproject-metadata
  Successfully installed exceptiongroup-1.2.0 packaging-23.2 pathspec-0.11.2 pyproject-metadata-0.7.1 scikit-build-core-0.6.1 tomli-2.0.1
  Installing build dependencies ... done
  Running command Getting requirements to build wheel
  Getting requirements to build wheel ... done
  Running command pip subprocess to install backend dependencies
  Collecting cmake>=3.21
    Using cached cmake-3.27.7-py2.py3-none-win_amd64.whl.metadata (6.8 kB)
  Using cached cmake-3.27.7-py2.py3-none-win_amd64.whl (34.6 MB)
  Installing collected packages: cmake
  Successfully installed cmake-3.27.7
  Installing backend dependencies ... done
  Running command Preparing metadata (pyproject.toml)
  scikit-build-core 0.6.1 using CMake 3.27.7 (metadata_wheel)
  Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=4.5.0 (from llama-cpp-python)
  Obtaining dependency information for typing-extensions>=4.5.0 from https://files.pythonhosted.org/packages/24/21/7d397a4b7934ff4028987914ac1044d3b7d52712f30e2ac7a2ae5bc86dd0/typing_extensions-4.8.0-py3-none-any.whl.metadata
  Downloading typing_extensions-4.8.0-py3-none-any.whl.metadata (3.0 kB)
Collecting numpy>=1.20.0 (from llama-cpp-python)
  Obtaining dependency information for numpy>=1.20.0 from https://files.pythonhosted.org/packages/24/b5/fed6f7e582937eb947369dccf6c94602598a25f23e482d1b1f2299159328/numpy-1.26.2-cp310-cp310-win_amd64.whl.metadata
  Downloading numpy-1.26.2-cp310-cp310-win_amd64.whl.metadata (61 kB)
     ---------------------------------------- 61.2/61.2 kB ? eta 0:00:00
Collecting diskcache>=5.6.1 (from llama-cpp-python)
  Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata
  Downloading diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Downloading diskcache-5.6.3-py3-none-any.whl (45 kB)
   ---------------------------------------- 45.5/45.5 kB ? eta 0:00:00
Downloading numpy-1.26.2-cp310-cp310-win_amd64.whl (15.8 MB)
   ---------------------------------------- 15.8/15.8 MB 31.2 MB/s eta 0:00:00
Downloading typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Building wheels for collected packages: llama-cpp-python
  Running command Building wheel for llama-cpp-python (pyproject.toml)
  scikit-build-core 0.6.1 using CMake 3.27.7 (wheel)
  *** Configuring CMake...
  2023-11-24 18:11:54,242 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
  loading initial cache file C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\CMakeInit.txt
  -- Building for: Visual Studio 17 2022
  -- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.19045.
  -- The C compiler identification is MSVC 19.38.33130.0
  -- The CXX compiler identification is MSVC 19.38.33130.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.38.33130/bin/Hostx64/x64/cl.exe - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.38.33130/bin/Hostx64/x64/cl.exe - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - not found
  -- Found Threads: TRUE
  -- CLBlast found
  -- CMAKE_SYSTEM_PROCESSOR: AMD64
  -- CMAKE_GENERATOR_PLATFORM: x64
  -- x86 detected
  -- Performing Test HAS_AVX_1
  -- Performing Test HAS_AVX_1 - Success
  -- Performing Test HAS_AVX2_1
  -- Performing Test HAS_AVX2_1 - Success
  -- Performing Test HAS_FMA_1
  -- Performing Test HAS_FMA_1 - Success
  -- Performing Test HAS_AVX512_1
  -- Performing Test HAS_AVX512_1 - Failed
  -- Performing Test HAS_AVX512_2
  -- Performing Test HAS_AVX512_2 - Failed
  CMake Warning (dev) at CMakeLists.txt:20 (install):
    Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  This warning is for project developers. Use -Wno-dev to suppress it.

  CMake Warning (dev) at CMakeLists.txt:29 (install):
    Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  This warning is for project developers. Use -Wno-dev to suppress it.

  -- Configuring done (14.4s)
  -- Generating done (0.1s)
  -- Build files have been written to: C:/Users/joere/AppData/Local/Temp/tmp3m1qyz5s/build
  *** Building project with Visual Studio 17 2022...
  Change Dir: 'C:/Users/joere/AppData/Local/Temp/tmp3m1qyz5s/build'

  Run Build Command(s): "C:/Program Files/Microsoft Visual Studio/2022/Community/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
  MSBuild version 17.8.3+195e7f5a3 for .NET Framework
  Build started 11/24/2023 18:12:09.

  Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" on node 1 (default targets).
  Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "x64\Release\ZERO_CHECK\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\".
  InitializeBuildStatus:
    Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
  CustomBuild:
    1>Checking Build System
  FinalizeBuildStatus:
    Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
    Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
  Done Building Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ZERO_CHECK.vcxproj" (default targets).
  Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "build_info.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "build_info.dir\Release\build_info.tlog\".
  InitializeBuildStatus:
    Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
  CustomBuild:
    Generating build details from Git
    -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.42.0.windows.2")
    Building Custom Rule C:/Users/joere/AppData/Local/Temp/pip-install-vwmucycq/llama-cpp-python_d752d1afbe344b418390ab5757a27b06/vendor/llama.cpp/common/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_CLBLAST /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\common\build-info.cpp"
    build-info.cpp
  Lib:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64 /machine:x64 "build_info.dir\Release\build-info.obj"
    build_info.vcxproj -> C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
  FinalizeBuildStatus:
    Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
    Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
  Done Building Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
  Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj" (4) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "ggml.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "ggml.dir\Release\ggml.tlog\".
  InitializeBuildStatus:
    Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/joere/AppData/Local/Temp/pip-install-vwmucycq/llama-cpp-python_d752d1afbe344b418390ab5757a27b06/vendor/llama.cpp/CMakeLists.txt
  ClCompile:
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /I"C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_USE_CLBLAST /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11 /Fo"ggml.dir\Release\" /Fd"ggml.dir\Release\ggml.pdb" /external:W0 /Gd /TC /errorReport:queue /external:I "C:/Users/joere/anaconda3/envs/llama/Library/include" "C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml.c" "C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-alloc.c" "C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-backend.c" "C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-quants.c"
    ggml.c
    ggml-alloc.c
    ggml-backend.c
    ggml-quants.c
    C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-backend.c(875,21): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 1 has type 'unsigned __int64' [C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj]
      C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-backend.c(875,21): consider using '%llu' in the format string
      C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-backend.c(875,21): consider using '%Iu' in the format string
      C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-backend.c(875,21): consider using '%I64u' in the format string
    C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.38.33130\include\xkeycheck.h(54,1): error C1189: #error: The C++ Standard Library forbids macroizing the keyword "bool". Enable warning C4005 to find the forbidden define. [C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj]
      (compiling source file '../../../../pip-install-vwmucycq/llama-cpp-python_d752d1afbe344b418390ab5757a27b06/vendor/llama.cpp/ggml-opencl.cpp')
  Done Building Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj" (default targets) -- FAILED.
  Done Building Project "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" (default targets) -- FAILED.
Build FAILED.
"C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj" (default target) (4) -> (ClCompile target) -> C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06\vendor\llama.cpp\ggml-backend.c(875,21): warning C4477: 'fprintf' : format string '%lu' requires an argument of type 'unsigned long', but variadic argument 1 has type 'unsigned __int64' [C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj]
"C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj" (default target) (4) -> (ClCompile target) -> C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.38.33130\include\xkeycheck.h(54,1): error C1189: #error: The C++ Standard Library forbids macroizing the keyword "bool". Enable warning C4005 to find the forbidden define. [C:\Users\joere\AppData\Local\Temp\tmp3m1qyz5s\build\vendor\llama.cpp\ggml.vcxproj]
Time Elapsed 00:00:02.36
  *** CMake build failed
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> See above for output.

  note: This error originates from a subprocess, and is likely not a problem with pip.
  full command: 'C:\Users\joere\anaconda3\envs\llama\python.exe' 'C:\Users\joere\anaconda3\envs\llama\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py' build_wheel 'C:\Users\joere\AppData\Local\Temp\tmp1mk2dgh1'
  cwd: C:\Users\joere\AppData\Local\Temp\pip-install-vwmucycq\llama-cpp-python_d752d1afbe344b418390ab5757a27b06
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
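One aside on the C4477 warning that shows up in both build attempts: it's unrelated to the failure - MSVC is just flagging that a 64-bit value is passed where the "%lu" format expects a 32-bit unsigned long. A hedged sketch of the portable fix the compiler suggests (not the actual ggml-backend.c code, just the pattern):

    #include <cstdio>
    #include <cstdint>
    #include <cinttypes>

    int main() {
        uint64_t n = 1ULL << 32;          // 64-bit value, like the one at ggml-backend.c(875)
        std::printf("%" PRIu64 "\n", n);  // portable 64-bit format macro from <cinttypes>
        std::printf("%zu\n", sizeof n);   // %zu is the correct format for size_t
        return 0;
    }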
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
AdapterCompatibility          Name
Advanced Micro Devices, Inc.  AMD Radeon RX 6900 XT
Host Name:                 DESKTOP-U0LG59P
OS Name:                   Microsoft Windows 10 Pro
OS Version:                10.0.19045 N/A Build 19045
OS Manufacturer:           Microsoft Corporation
OS Configuration:          Standalone Workstation
OS Build Type:             Multiprocessor Free
Registered Owner:          joe.reves@att.net
Registered Organization:
Product ID:                00330-71448-41750-AAOEM
Original Install Date:     4/12/2021, 5:24:53
System Boot Time:          11/16/2023, 9:45:34
System Manufacturer:       Micro-Star International Co., Ltd.
System Model:              MS-7C35
System Type:               x64-based PC
Processor(s):              1 Processor(s) Installed.
                           [01]: AMD64 Family 23 Model 113 Stepping 0 AuthenticAMD ~3800 Mhz
BIOS Version:              American Megatrends International, LLC. A.80, 1/22/2021
Windows Directory:         C:\Windows
System Directory:          C:\Windows\system32
Boot Device:               \Device\HarddiskVolume1
System Locale:             en-us;English (United States)
Input Locale:              en-us;English (United States)
Time Zone:                 (UTC-06:00) Central Time (US & Canada)
Total Physical Memory:     65,457 MB
Available Physical Memory: 38,351 MB
Virtual Memory: Max Size:  75,185 MB
Virtual Memory: Available: 40,122 MB
Virtual Memory: In Use:    35,063 MB
Page File Location(s):     C:\pagefile.sys
Domain:                    WORKGROUP
Logon Server:              \\DESKTOP-U0LG59P
Hotfix(s):                 36 Hotfix(s) Installed.
Network Card(s):           5 NIC(s) Installed.
                           [01]: Realtek PCIe 2.5GbE Family Controller
                                 Connection Name: Ethernet
                                 Status:          Media disconnected
                           [02]: Intel(R) Wi-Fi 6 AX200 160MHz
                                 Connection Name: Wi-Fi
                                 DHCP Enabled:    Yes
                                 DHCP Server:     192.168.128.1
                                 IP address(es)
Hyper-V Requirements:      VM Monitor Mode Extensions: Yes
                           Virtualization Enabled In Firmware: No
                           Second Level Address Translation: Yes
                           Data Execution Prevention Available: Yes
OS Name:    Microsoft Windows 10 Pro
OS Version: 10.0.19045 N/A Build 19045
(llama) C:\Users\joere>cmake -version
cmake version 3.27.7

(llama) C:\Users\joere>conda -V
conda 23.10.0

(llama) C:\Users\joere>conda list clblast
# packages in environment at C:\Users\joere\anaconda3\envs\llama:
#
# Name                    Version                   Build  Channel
clblast                   1.5.2                h2d74725_0  conda-forge
Microsoft Visual Studio Community 2022
Version 17.8.1
VisualStudio.17.Release/17.8.1+34316.72
Microsoft .NET Framework
Version 4.8.09037
Installed Version: Community
Visual C++ 2022 00482-90000-00000-AA776 Microsoft Visual C++ 2022
ASP.NET and Web Tools 17.8.352.38654 ASP.NET and Web Tools
Azure App Service Tools v3.0.0 17.8.352.38654 Azure App Service Tools v3.0.0
C# Tools 4.8.0-7.23558.1+e091728607ca0fc9efca55ccfb3e59259c6b5a0a C# components used in the IDE. Depending on your project type and settings, a different version of the compiler may be used.
Cookiecutter 17.0.23262.1 Provides tools for finding, instantiating and customizing templates in cookiecutter format.
Linux Core Dump Debugging 1.0.9.34309 Enables debugging of Linux core dumps.
Microsoft JVM Debugger 1.0 Provides support for connecting the Visual Studio debugger to JDWP compatible Java Virtual Machines
NuGet Package Manager 6.8.0 NuGet Package Manager in Visual Studio. For more information about NuGet, visit https://docs.nuget.org/
Python - Profiling support 17.0.23262.1 Profiling support for Python projects.
Python with Pylance 17.0.23262.1 Provides IntelliSense, projects, templates, debugging, interactive windows, and other support for Python developers.
TypeScript Tools 17.0.20920.2001 TypeScript Tools for Microsoft Visual Studio
Visual Basic Tools 4.8.0-7.23558.1+e091728607ca0fc9efca55ccfb3e59259c6b5a0a Visual Basic components used in the IDE. Depending on your project type and settings, a different version of the compiler may be used.
Visual C++ for Cross Platform Mobile Development (Android) 15.0.34205.153 Visual C++ for Cross Platform Mobile Development (Android)
Visual C++ for Linux Development 1.0.9.34309 Visual C++ for Linux Development
Visual Studio IntelliCode 2.2 AI-assisted development for Visual Studio.