abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Windows Build Stuck at "Building wheel for llama-cpp-python (pyproject.toml) ... Generating Code..." #1714

Closed: Orenji-Tangerine closed this issue 2 months ago

Orenji-Tangerine commented 2 months ago

[screenshot]

I have a problem installing this. I have the C++ Build Tools installed, but I still cannot get the package to build. Previously, I had an older version in my Python environment; it caused problems when installing the new version, so I uninstalled the old one. Now, every time I try to install the new version, the build gets stuck at this point.

[screenshot]

abetlen commented 2 months ago

@Orenji-Tangerine can you try installing with --verbose and post the logs?
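For reference, a minimal way to capture that from a Windows command prompt; the redirection to `build.log` is just an optional convenience for attaching the output (the file name is arbitrary):

```bat
python -m pip install llama-cpp-python --verbose > build.log 2>&1
```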

Orenji-Tangerine commented 2 months ago

> @Orenji-Tangerine can you try installing with --verbose and post the logs?

Appreciate your help :)

A:\ComfyUI\python>python -m pip install llama-cpp-python --verbose Using pip 24.2 from A:\ComfyUI\python\lib\site-packages\pip (python 3.10) Collecting llama-cpp-python Downloading llama_cpp_python-0.2.90.tar.gz (63.8 MB) ---------------------------------------- 63.8/63.8 MB 34.4 MB/s eta 0:00:00 Running command pip subprocess to install build dependencies Using pip 24.2 from A:\ComfyUI\python\Lib\site-packages\pip (python 3.10) Collecting scikit-build-core>=0.9.2 (from scikit-build-core[pyproject]>=0.9.2) Obtaining dependency information for scikit-build-core>=0.9.2 from https://files.pythonhosted.org/packages/20/f0/11b0f09173051647af2e140f68f3d94432c5b41a6ea0d45a43e38ab68192/scikit_build_core-0.10.5-py3-none-any.whl.metadata Downloading scikit_build_core-0.10.5-py3-none-any.whl.metadata (20 kB) Collecting exceptiongroup>=1.0 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2) Obtaining dependency information for exceptiongroup>=1.0 from https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl.metadata Downloading exceptiongroup-1.2.2-py3-none-any.whl.metadata (6.6 kB) Collecting packaging>=21.3 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2) Obtaining dependency information for packaging>=21.3 from https://files.pythonhosted.org/packages/08/aa/cc0199a5f0ad350994d660967a8efb233fe0416e4639146c089643407ce6/packaging-24.1-py3-none-any.whl.metadata Downloading packaging-24.1-py3-none-any.whl.metadata (3.2 kB) Collecting pathspec>=0.10.1 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2) Obtaining dependency information for pathspec>=0.10.1 from https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl.metadata Downloading pathspec-0.12.1-py3-none-any.whl.metadata (21 kB) Collecting tomli>=1.2.2 (from scikit-build-core>=0.9.2->scikit-build-core[pyproject]>=0.9.2) Obtaining dependency information for tomli>=1.2.2 from https://files.pythonhosted.org/packages/97/75/10a9ebee3fd790d20926a90a2547f0bf78f371b2f13aa822c759680ca7b9/tomli-2.0.1-py3-none-any.whl.metadata Downloading tomli-2.0.1-py3-none-any.whl.metadata (8.9 kB) Downloading scikit_build_core-0.10.5-py3-none-any.whl (164 kB) Downloading exceptiongroup-1.2.2-py3-none-any.whl (16 kB) Downloading packaging-24.1-py3-none-any.whl (53 kB) Downloading pathspec-0.12.1-py3-none-any.whl (31 kB) Downloading tomli-2.0.1-py3-none-any.whl (12 kB) Installing collected packages: tomli, pathspec, packaging, exceptiongroup, scikit-build-core ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. flet 0.23.2 requires packaging<24.0,>=23.1, but you have packaging 24.1 which is incompatible. mediapipe 0.10.14 requires protobuf<5,>=4.25.3, but you have protobuf 5.28.0 which is incompatible. mido 1.3.2 requires packaging~=23.1, but you have packaging 24.1 which is incompatible. torchscale 0.3.0 requires fairscale==0.4.0, but you have fairscale 0.4.13 which is incompatible. Successfully installed exceptiongroup-1.2.2 packaging-24.1 pathspec-0.12.1 scikit-build-core-0.10.5 tomli-2.0.1 Installing build dependencies ... done Running command Getting requirements to build wheel Getting requirements to build wheel ... 
done Running command pip subprocess to install backend dependencies Using pip 24.2 from A:\ComfyUI\python\Lib\site-packages\pip (python 3.10) Collecting cmake>=3.21 Obtaining dependency information for cmake>=3.21 from https://files.pythonhosted.org/packages/5b/34/a6a1030ec63da17e884bf2916f7ff92ad76f730d5e8edafd948b99c05384/cmake-3.30.2-py3-none-win_amd64.whl.metadata Downloading cmake-3.30.2-py3-none-win_amd64.whl.metadata (6.1 kB) Downloading cmake-3.30.2-py3-none-win_amd64.whl (35.6 MB) ---------------------------------------- 35.6/35.6 MB 32.3 MB/s eta 0:00:00 Installing collected packages: cmake Creating C:\Users\User\AppData\Local\Temp\pip-build-env-276ourug\normal\Scripts Successfully installed cmake-3.30.2 Installing backend dependencies ... done Running command Preparing metadata (pyproject.toml) scikit-build-core 0.10.5 using CMake 3.30.2 (metadata_wheel) Preparing metadata (pyproject.toml) ... done Requirement already satisfied: typing-extensions>=4.5.0 in a:\comfyui\python\lib\site-packages (from llama-cpp-python) (4.12.2) Requirement already satisfied: numpy>=1.20.0 in a:\comfyui\python\lib\site-packages (from llama-cpp-python) (1.26.4) Requirement already satisfied: diskcache>=5.6.1 in a:\comfyui\python\lib\site-packages (from llama-cpp-python) (5.6.3) Requirement already satisfied: jinja2>=2.11.3 in a:\comfyui\python\lib\site-packages (from llama-cpp-python) (3.1.2) Requirement already satisfied: MarkupSafe>=2.0 in a:\comfyui\python\lib\site-packages (from jinja2>=2.11.3->llama-cpp-python) (2.1.3) Building wheels for collected packages: llama-cpp-python Running command Building wheel for llama-cpp-python (pyproject.toml) scikit-build-core 0.10.5 using CMake 3.30.2 (wheel) *** Configuring CMake... 2024-08-29 23:36:38,488 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None loading initial cache file C:\Users\User\AppData\Local\Temp\tmpfhziza\build\CMakeInit.txt -- Building for: Visual Studio 17 2022 -- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.19045. 
-- The C compiler identification is MSVC 19.41.34120.0 -- The CXX compiler identification is MSVC 19.41.34120.0 -- Detecting C compiler ABI info -- Detecting C compiler ABI info - done -- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.41.34120/bin/Hostx64/x64/cl.exe - skipped -- Detecting C compile features -- Detecting C compile features - done -- Detecting CXX compiler ABI info -- Detecting CXX compiler ABI info - done -- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.41.34120/bin/Hostx64/x64/cl.exe - skipped -- Detecting CXX compile features -- Detecting CXX compile features - done -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.45.1.windows.1") -- Performing Test CMAKE_HAVE_LIBC_PTHREAD -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed -- Looking for pthread_create in pthreads -- Looking for pthread_create in pthreads - not found -- Looking for pthread_create in pthread -- Looking for pthread_create in pthread - not found -- Found Threads: TRUE -- Found OpenMP_C: -openmp (found version "2.0") -- Found OpenMP_CXX: -openmp (found version "2.0") -- Found OpenMP: TRUE (found version "2.0") -- OpenMP found -- Using llamafile -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF -- CMAKE_SYSTEM_PROCESSOR: AMD64 -- CMAKE_GENERATOR_PLATFORM: x64 -- x86 detected -- Performing Test HAS_AVX_1 -- Performing Test HAS_AVX_1 - Success -- Performing Test HAS_AVX2_1 -- Performing Test HAS_AVX2_1 - Success -- Performing Test HAS_FMA_1 -- Performing Test HAS_FMA_1 - Success -- Performing Test HAS_AVX512_1 -- Performing Test HAS_AVX512_1 - Failed -- Performing Test HAS_AVX512_2 -- Performing Test HAS_AVX512_2 - Failed CMake Warning (dev) at CMakeLists.txt:9 (install): Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. Call Stack (most recent call first): CMakeLists.txt:73 (llama_cpp_python_install_target) This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:17 (install): Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. Call Stack (most recent call first): CMakeLists.txt:73 (llama_cpp_python_install_target) This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:9 (install): Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. Call Stack (most recent call first): CMakeLists.txt:74 (llama_cpp_python_install_target) This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:17 (install): Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. Call Stack (most recent call first): CMakeLists.txt:74 (llama_cpp_python_install_target) This warning is for project developers. Use -Wno-dev to suppress it.

-- Configuring done (58.0s) -- Generating done (0.4s) -- Build files have been written to: C:/Users/User/AppData/Local/Temp/tmpfhziza/build *** Building project with Visual Studio 17 2022... Change Dir: 'C:/Users/User/AppData/Local/Temp/tmpfhziza/build'

Run Build Command(s): "C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n MSBuild version 17.11.2+c078802d4 for .NET Framework Build started 29/08/2024 11:37:37 PM.

Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" on node 1 (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets). PrepareForBuild: Creating directory "x64\Release\ZERO_CHECK\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\". InitializeBuildStatus: Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild". CustomBuild: 1>Checking Build System FinalizeBuildStatus: Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild". Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets). PrepareForBuild: Creating directory "build_info.dir\Release\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "build_info.dir\Release\build_info.tlog\". InitializeBuildStatus: Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild". 
CustomBuild: Generating build details from Git -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.45.1.windows.1") Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/common/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"build_info.dir\Release\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\common\build-info.cpp" build-info.cpp Lib: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64 /machine:x64 "build_info.dir\Release\build-info.obj" build_info.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib FinalizeBuildStatus: Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild". Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) on node 1 (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (5) on node 1 (default targets). C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\bin\Release\". Creating directory "ggml.dir\Release\ggml.tlog\". InitializeBuildStatus: Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/ggml/src/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src." 
/nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_SHARED /D GGML_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /std:c11 /Fo"ggml.dir\Release\" /Fd"ggml.dir\Release\vc143.pdb" /external:W1 /Gd /TC /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml.c" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml-alloc.c" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml-backend.c" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml-quants.c" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml-aarch64.c" ggml.c ggml-alloc.c ggml-backend.c ggml-quants.c ggml-aarch64.c Generating Code... C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj] (compiling source file '../../../../../../pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/ggml/src/ggml-quants.c') C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml-common.h(62,9): see previous definition of 'static_assert'

C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj] (compiling source file '../../../../../../pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/ggml/src/ggml-aarch64.c') C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\ggml-common.h(62,9): see previous definition of 'static_assert'

C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D GGML_SHARED /D GGML_BUILD /D _CRT_SECURE_NO_WARNINGS /D GGML_SCHED_MAX_COPIES=4 /D GGML_USE_OPENMP /D GGML_USE_LLAMAFILE /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D ggml_EXPORTS /EHsc /MD /GS /arch:AVX2 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /openmp /Fo"ggml.dir\Release\\" /Fd"ggml.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src\llamafile\sgemm.cpp"
sgemm.cpp

MakeDirsForLink: Creating directory "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\Release\". PreLinkEvent: Auto build dll exports setlocal cd C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src if %errorlevel% neq 0 goto :cmEnd C: if %errorlevel% neq 0 goto :cmEnd C:\Users\User\AppData\Local\Temp\pip-build-env-276ourug\normal\Lib\site-packages\cmake\data\bin\cmake.exe -E __create_def C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/ggml/src/ggml.dir/Release/exports.def C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/ggml/src/ggml.dir/Release//objects.txt if %errorlevel% neq 0 goto :cmEnd :cmEnd endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone :cmErrorLevel exit /b %1 :cmDone if %errorlevel% neq 0 goto :VCEnd :VCEnd Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\bin\Release\ggml.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /DEF:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/ggml/src/ggml.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/bin/Release/ggml.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/ggml/src/Release/ggml.lib" /MACHINE:X64 /machine:x64 /DLL ggml.dir\Release\ggml.obj "ggml.dir\Release\ggml-alloc.obj" "ggml.dir\Release\ggml-backend.obj" "ggml.dir\Release\ggml-quants.obj" ggml.dir\Release\sgemm.obj "ggml.dir\Release\ggml-aarch64.obj" Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/ggml/src/Release/ggml.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/ggml/src/Release/ggml.exp ggml.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\bin\Release\ggml.dll FinalizeBuildStatus: Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild". Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (4) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\src\llama.vcxproj" (6) on node 1 (default targets). C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\src\llama.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llama.dir\Release\llama.tlog\". InitializeBuildStatus: Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild". 
CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/src/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src..\include" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama.dir\Release\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src\llama.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src\llama-vocab.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src\llama-grammar.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src\llama-sampling.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src\unicode.cpp" "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src\unicode-data.cpp" llama.cpp llama-vocab.cpp llama-grammar.cpp llama-sampling.cpp unicode.cpp unicode-data.cpp Generating Code...

Orenji-Tangerine commented 2 months ago

The build stops at "Generating Code...".

Orenji-Tangerine commented 2 months ago

But this time, with --verbose enabled, it works? It does print some warnings, though. Is it safe to say there's nothing to worry about? Below is where it continues from "Generating Code...".

Lib: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\Release\common.lib" /NOLOGO /MACHINE:X64 /machine:x64 common.dir\Release\common.obj common.dir\Release\sampling.obj common.dir\Release\console.obj "common.dir\Release\grammar-parser.obj" "common.dir\Release\json-schema-to-grammar.obj" common.dir\Release\train.obj "common.dir\Release\ngram-cache.obj" "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.dir\Release\build-info.obj" common.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\Release\common.lib FinalizeBuildStatus: Deleting file "common.dir\Release\common.tlog\unsuccessfulbuild". Touching "common.dir\Release\common.tlog\common.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\common.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj" (8) on node 1 (default targets). PrepareForBuild: Creating directory "llama-llava-cli.dir\Release\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\". Creating directory "llama-llava-cli.dir\Release\llama-llava-cli.tlog\". InitializeBuildStatus: Creating "llama-llava-cli.dir\Release\llama-llava-cli.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llama-llava-cli.dir\Release\llama-llava-cli.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\common." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava...." 
/I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava....\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama-llava-cli.dir\Release\" /Fd"llama-llava-cli.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\llava-cli.cpp" llava-cli.cpp Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-llava-cli.exe" /INCREMENTAL:NO /NOLOGO ....\common\Release\common.lib ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.lib" /MACHINE:X64 /machine:x64 "llama-llava-cli.dir\Release\llava-cli.obj" C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.exp llama-llava-cli.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-llava-cli.exe FinalizeBuildStatus: Deleting file "llama-llava-cli.dir\Release\llama-llava-cli.tlog\unsuccessfulbuild". Touching "llama-llava-cli.dir\Release\llama-llava-cli.tlog\llama-llava-cli.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj" (9) on node 1 (default targets). PrepareForBuild: Creating directory "llama-minicpmv-cli.dir\Release\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\". 
InitializeBuildStatus: Creating "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\common." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava...." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava....\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D "CMAKE_INTDIR=\"Release\"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama-minicpmv-cli.dir\Release\" /Fd"llama-minicpmv-cli.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\minicpmv-cli.cpp" minicpmv-cli.cpp Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-minicpmv-cli.exe" /INCREMENTAL:NO /NOLOGO ....\common\Release\common.lib ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.lib" /MACHINE:X64 /machine:x64 "llama-minicpmv-cli.dir\Release\minicpmv-cli.obj" C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.exp llama-minicpmv-cli.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-minicpmv-cli.exe 
FinalizeBuildStatus: Deleting file "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\unsuccessfulbuild". Touching "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\llama-minicpmv-cli.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (10) on node 1 (default targets). PrepareForBuild: Creating directory "llava_shared.dir\Release\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llava_shared.dir\Release\llava_shared.tlog\". InitializeBuildStatus: Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.exp llava_shared.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava.dll FinalizeBuildStatus: Deleting file "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild". Touching "llava_shared.dir\Release\llava_shared.tlog\llava_shared.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets). 
Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (11) on node 1 (default targets). PrepareForBuild: Creating directory "llava_static.dir\Release\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llava_static.dir\Release\llava_static.tlog\". InitializeBuildStatus: Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt Lib: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj llava_static.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib FinalizeBuildStatus: Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild". Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets). PrepareForBuild: Creating directory "x64\Release\ALL_BUILD\". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "x64\Release\ALL_BUILD\ALL_BUILD.tlog\". InitializeBuildStatus: Creating "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/CMakeLists.txt FinalizeBuildStatus: Deleting file "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild". Touching "x64\Release\ALL_BUILD\ALL_BUILD.tlog\ALL_BUILD.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default targets).

Build succeeded.

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj" (default target) (2) -> (PrepareForBuild target) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj" (default target) (3) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) -> (ClCompile target) -> C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj] C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (6) -> (PrepareForBuild target) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\src\llama.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> (ClCompile target) -> C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\clip.cpp(1037,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj] C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\clip.cpp(1465,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj] C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\clip.cpp(2620,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\common.vcxproj" (default target) (7) -> (PrepareForBuild target) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\common.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj" (default target) (8) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj" (default target) (9) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default target) (10) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default target) (11) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj]

  16 Warning(s)
  0 Error(s)

Time Elapsed 00:16:39.80

Installing project into wheel... -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/ggml.lib -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/ggml.dll -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-alloc.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-backend.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-blas.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-cann.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-cuda.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-kompute.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-metal.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-rpc.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-sycl.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-vulkan.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/ggml.lib -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/ggml.dll -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-alloc.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-backend.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-blas.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-cann.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-cuda.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-kompute.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-metal.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-rpc.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-sycl.h -- Up-to-date: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/ggml-vulkan.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/llama.lib -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/llama.dll -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/include/llama.h -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/cmake/llama/llama-config.cmake -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/cmake/llama/llama-version.cmake -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/convert_hf_to_gguf.py -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/pkgconfig/llama.pc -- Installing: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/llama.lib -- Installing: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/llama.dll -- Installing: 
C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/llama.lib -- Installing: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/llama.dll -- Installing: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/ggml.lib -- Installing: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/ggml.dll -- Installing: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/ggml.lib -- Installing: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/ggml.dll -- Up-to-date: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/ggml.dll -- Up-to-date: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/ggml.dll -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/lib/llava.lib -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/llava.dll -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/llama-llava-cli.exe -- Installing: C:\Users\User\AppData\Local\Temp\tmpfhziza\wheel\platlib/bin/llama-minicpmv-cli.exe -- Installing: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/llava.lib -- Installing: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/llava.dll -- Installing: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/llava.lib -- Installing: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/llava.dll -- Up-to-date: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/llama.dll -- Up-to-date: C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/llama_cpp/lib/ggml.dll -- Up-to-date: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/llama.dll -- Up-to-date: C:/Users/User/AppData/Local/Temp/tmpfhziza/wheel/platlib/llama_cpp/lib/ggml.dll Making wheel... *** Created llama_cpp_python-0.2.90-cp310-cp310-win_amd64.whl Building wheel for llama-cpp-python (pyproject.toml) ... done Created wheel for llama-cpp-python: filename=llama_cpp_python-0.2.90-cp310-cp310-win_amd64.whl size=3139563 sha256=eb0c43773f835ca48a89ab5e11ffc4f656fe2ee168d22e6db5410cc26f97849b Stored in directory: c:\users\User\appdata\local\pip\cache\wheels\3d\67\02\f950031435db4a5a02e6269f6adb6703bf1631c3616380f3c6 Successfully built llama-cpp-python Installing collected packages: llama-cpp-python Successfully installed llama-cpp-python-0.2.90

abetlen commented 2 months ago

@Orenji-Tangerine yes, those warnings shouldn't impact the installation. Glad to hear it worked eventually.
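For a quick sanity check that the freshly built wheel actually loads (a minimal check, assuming it was installed into the same environment that ran pip):

rem Importing llama_cpp loads llama.dll/ggml.dll, so a clean import and version print means the wheel is usable
python -c "import llama_cpp; print(llama_cpp.__version__)"

This should print the installed version (0.2.90 here) if the DLLs resolved correctly.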

I found this link that might be useful: https://stackoverflow.com/a/27862596

In the linked discussion it seems that Visual Studio may be performing some aggressive link-time optimization; it's bizarre that this isn't an issue when building llama.cpp on its own, however.
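For anyone who hits the long "Generating Code..." phase again, one thing worth trying as a diagnostic (a sketch only: CMAKE_ARGS is the usual way to pass extra CMake flags through to the llama.cpp build, and whether link-time code generation is really the culprit here is just a guess):

rem Ask CMake not to enable interprocedural optimization (/GL + /LTCG on MSVC), then reinstall with full output.
rem This is a no-op if LTO wasn't enabled in the first place, so it's safe to use just to rule it out.
set CMAKE_ARGS=-DCMAKE_INTERPROCEDURAL_OPTIMIZATION=OFF
python -m pip install --no-cache-dir --verbose llama-cpp-python

If the build is just as slow with that flag, the time is probably going into ordinary compilation of ggml/llama.cpp rather than the linker.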

abetlen commented 2 months ago

Closing as a duplicate of #1703; moving the discussion there.

chozillla commented 2 weeks ago

But this time, with --verbose logging enabled, it works, though it does print some warnings. Is it safe to say there's nothing to worry about? Below is where the log continues from "Generating Code".

Lib: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\Release\common.lib" /NOLOGO /MACHINE:X64 /machine:x64 common.dir\Release\common.obj common.dir\Release\sampling.obj common.dir\Release\console.obj "common.dir\Release\grammar-parser.obj" "common.dir\Release\json-schema-to-grammar.obj" common.dir\Release\train.obj "common.dir\Release\ngram-cache.obj" "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.dir\Release\build-info.obj" common.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\Release\common.lib FinalizeBuildStatus: Deleting file "common.dir\Release\common.tlog\unsuccessfulbuild". Touching "common.dir\Release\common.tlog\common.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\common.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj" (8) on node 1 (default targets). PrepareForBuild: Creating directory "llama-llava-cli.dir\Release". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release". Creating directory "llama-llava-cli.dir\Release\llama-llava-cli.tlog". InitializeBuildStatus: Creating "llama-llava-cli.dir\Release\llama-llava-cli.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llama-llava-cli.dir\Release\llama-llava-cli.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\common." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava...." 
/I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava....\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D "CMAKE_INTDIR="Release"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama-llava-cli.dir\Release\" /Fd"llama-llava-cli.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\llava-cli.cpp" llava-cli.cpp Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-llava-cli.exe" /INCREMENTAL:NO /NOLOGO ....\common\Release\common.lib ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.lib" /MACHINE:X64 /machine:x64 "llama-llava-cli.dir\Release\llava-cli.obj" C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-llava-cli.exp llama-llava-cli.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-llava-cli.exe FinalizeBuildStatus: Deleting file "llama-llava-cli.dir\Release\llama-llava-cli.tlog\unsuccessfulbuild". Touching "llama-llava-cli.dir\Release\llama-llava-cli.tlog\llama-llava-cli.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj" (9) on node 1 (default targets). PrepareForBuild: Creating directory "llama-minicpmv-cli.dir\Release". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog". InitializeBuildStatus: Creating "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. 
Touching "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\CL.exe /c /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\common." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\ggml\src..\include" /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava...." /I"C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava....\common" /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D "CMAKE_INTDIR="Release"" /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama-minicpmv-cli.dir\Release\" /Fd"llama-minicpmv-cli.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\minicpmv-cli.cpp" minicpmv-cli.cpp Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-minicpmv-cli.exe" /INCREMENTAL:NO /NOLOGO ....\common\Release\common.lib ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.lib" /MACHINE:X64 /machine:x64 "llama-minicpmv-cli.dir\Release\minicpmv-cli.obj" C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llama-minicpmv-cli.exp llama-minicpmv-cli.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llama-minicpmv-cli.exe FinalizeBuildStatus: Deleting file "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\unsuccessfulbuild". 
Touching "llama-minicpmv-cli.dir\Release\llama-mi.973590C3.tlog\llama-minicpmv-cli.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (10) on node 1 (default targets). PrepareForBuild: Creating directory "llava_shared.dir\Release". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llava_shared.dir\Release\llava_shared.tlog". InitializeBuildStatus: Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt Link: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj Creating library C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/User/AppData/Local/Temp/tmpfhziza/build/vendor/llama.cpp/examples/llava/Release/llava.exp llava_shared.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava.dll FinalizeBuildStatus: Deleting file "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild". Touching "llava_shared.dir\Release\llava_shared.tlog\llava_shared.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets). Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (11) on node 1 (default targets). 
PrepareForBuild: Creating directory "llava_static.dir\Release". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "llava_static.dir\Release\llava_static.tlog". InitializeBuildStatus: Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/vendor/llama.cpp/examples/llava/CMakeLists.txt Lib: C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj llava_static.vcxproj -> C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib FinalizeBuildStatus: Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild". Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets). PrepareForBuild: Creating directory "x64\Release\ALL_BUILD". C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj] Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details. Creating directory "x64\Release\ALL_BUILD\ALL_BUILD.tlog". InitializeBuildStatus: Creating "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified. Touching "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild". CustomBuild: Building Custom Rule C:/Users/User/AppData/Local/Temp/pip-install-_21clzop/llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53/CMakeLists.txt FinalizeBuildStatus: Deleting file "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild". Touching "x64\Release\ALL_BUILD\ALL_BUILD.tlog\ALL_BUILD.lastbuildstate". Done Building Project "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default targets).

Build succeeded.

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj" (default target) (2) -> (PrepareForBuild target) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ZERO_CHECK.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj" (default target) (3) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\build_info.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) -> (ClCompile target) -> C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj] C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\ggml\src\ggml.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (6) -> (PrepareForBuild target) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\src\llama.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) -> (ClCompile target) -> C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\clip.cpp(1037,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj] C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\clip.cpp(1465,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj] C:\Users\User\AppData\Local\Temp\pip-install-_21clzop\llama-cpp-python_ea515b3b7ddf491eb1357e0ecbfc1c53\vendor\llama.cpp\examples\llava\clip.cpp(2620,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\common.vcxproj" (default target) (7) -> (PrepareForBuild target) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\common\common.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj" (default target) (8) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-llava-cli.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj" (default target) (9) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llama-minicpmv-cli.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default target) (10) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> "C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default target) (11) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj]

"C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj" (default target) (1) -> C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\Microsoft.CppBuild.targets(541,5): warning MSB8029: The Intermediate directory or Output directory cannot reside under the Temporary directory as it could lead to issues with incremental build. [C:\Users\User\AppData\Local\Temp\tmpfhziza\build\ALL_BUILD.vcxproj]

  16 Warning(s)
  0 Error(s)

Time Elapsed 00:16:39.80

This took 16 minutes to compile?
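For reference, a full MSVC Release build of llama.cpp plus the llava examples does take a while on Windows, so that's not unusual. Two hedged suggestions in case anyone wants to shorten it (the job count below is just an example, adjust it to your core count; the wheel index is the prebuilt-wheel index from the project README):

rem 1) Let CMake/MSBuild use more parallel jobs during the source build
set CMAKE_BUILD_PARALLEL_LEVEL=8
python -m pip install --no-cache-dir --verbose llama-cpp-python

rem 2) Or skip compiling entirely and install a prebuilt CPU wheel
python -m pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu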