abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Can't get GPU offloading to work #1112

Open Gitclop opened 8 months ago

Gitclop commented 8 months ago

I followed the README, but I can't get llama-cpp to run on my 4090.

set CMAKE_ARGS=-DLLAMA_CUBLAS=on
set FORCE_CMAKE=1
pip install llama-cpp-python  --upgrade --force-reinstall --no-cache-dir --verbose
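A common failure mode on Windows is that these variables never reach pip at all: `set` only works in cmd.exe, while PowerShell needs `$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"`. As a shell-agnostic sketch (standard library only, function names are illustrative), the environment can be passed to pip explicitly instead:

```python
import os
import sys

def build_cuda_env() -> dict:
    """Copy the current environment and add the flags pip's CMake run must see."""
    env = dict(os.environ)
    env["CMAKE_ARGS"] = "-DLLAMA_CUBLAS=on"  # cuBLAS switch used by llama.cpp builds of this vintage
    env["FORCE_CMAKE"] = "1"                 # force a from-source CMake build
    return env

def pip_reinstall_command() -> list:
    """pip invocation that discards caches so the wheel is genuinely rebuilt."""
    return [
        sys.executable, "-m", "pip", "install", "llama-cpp-python",
        "--upgrade", "--force-reinstall", "--no-cache-dir", "--verbose",
    ]
```

Running `subprocess.run(pip_reinstall_command(), env=build_cuda_env(), check=True)` then guarantees the flags are in pip's environment regardless of which shell launched Python.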

I am running Python 3.10 with the latest CUDA drivers and have tried different llama-cpp versions. In general, GPU offloading works on my system (sentence-transformers runs perfectly on the GPU); only llama-cpp is giving me trouble.
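For reference, when offloading does take effect, llama.cpp's verbose startup log contains a line reporting how many layers went to the GPU; if that line is absent, the wheel was built CPU-only and the CMake flags never took effect. The exact wording has varied across llama.cpp versions, so the pattern below is an assumption, not a stable interface. A small self-contained checker for a captured log:

```python
import re

# Pattern for llama.cpp's layer-offload report, e.g.
#   "llm_load_tensors: offloaded 35/35 layers to GPU"
# The wording has shifted between llama.cpp versions, so treat this
# regex as an assumption rather than a guaranteed interface.
_OFFLOAD_RE = re.compile(r"offloaded (\d+)/(\d+) layers to GPU")

def offloaded_layers(log_text: str) -> tuple:
    """Return (offloaded, total) layers found in a llama.cpp startup log.

    (0, 0) means no offload line was found at all, which is what a
    CPU-only build produces.
    """
    m = _OFFLOAD_RE.search(log_text)
    return (int(m.group(1)), int(m.group(2))) if m else (0, 0)
```

To produce such a log, construct the model with `Llama(model_path=..., n_gpu_layers=-1, verbose=True)` and capture stderr; `n_gpu_layers=-1` requests that all layers be offloaded.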

Running command Building wheel for llama-cpp-python (pyproject.toml)
  *** scikit-build-core 0.7.1 using CMake 3.28.1 (wheel)
  *** Configuring CMake...
  2024-01-21 12:17:46,994 - scikit_build_core - WARNING - Can't find a Python library, got libdir=None, ldlibrary=None, multiarch=None, masd=None
  loading initial cache file C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\build\CMakeInit.txt
  -- Building for: Visual Studio 17 2022
  -- The C compiler identification is MSVC 19.38.33134.0
  -- The CXX compiler identification is MSVC 19.38.33134.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.38.33130/bin/Hostx64/x64/cl.exe - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.38.33130/bin/Hostx64/x64/cl.exe - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.39.1.windows.1")
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - not found
  -- Found Threads: TRUE
  -- CMAKE_SYSTEM_PROCESSOR: AMD64
  -- CMAKE_GENERATOR_PLATFORM: x64
  -- x86 detected
  -- Performing Test HAS_AVX_1
  -- Performing Test HAS_AVX_1 - Success
  -- Performing Test HAS_AVX2_1
  -- Performing Test HAS_AVX2_1 - Success
  -- Performing Test HAS_FMA_1
  -- Performing Test HAS_FMA_1 - Success
  -- Performing Test HAS_AVX512_1
  -- Performing Test HAS_AVX512_1 - Success
  CMake Warning (dev) at CMakeLists.txt:21 (install):
    Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  This warning is for project developers.  Use -Wno-dev to suppress it.

  CMake Warning (dev) at CMakeLists.txt:30 (install):
    Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  This warning is for project developers.  Use -Wno-dev to suppress it.

  -- Configuring done (4.6s)
  -- Generating done (0.1s)
  -- Build files have been written to: C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build
  *** Building project with Visual Studio 17 2022...
  Change Dir: 'C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build'

  Run Build Command(s): "C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/MSBuild/Current/Bin/amd64/MSBuild.exe" ALL_BUILD.vcxproj /p:Configuration=Release /p:Platform=x64 /p:VisualStudioVersion=17.0 /v:n
  MSBuild version 17.8.5+b5265ef37 for .NET Framework
  Build started 21.01.2024 12:17:51.

  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" on node 1 (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "x64\Release\ZERO_CHECK\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\".
  InitializeBuildStatus:
    Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
  CustomBuild:
    1>Checking Build System
  FinalizeBuildStatus:
    Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
    Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ZERO_CHECK.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\build_info.vcxproj" (3) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "build_info.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "build_info.dir\Release\build_info.tlog\".
  InitializeBuildStatus:
    Creating "build_info.dir\Release\build_info.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
  CustomBuild:
    Generating build details from Git
    -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.39.1.windows.1")
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/common/CMakeLists.txt
  ClCompile:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBC
S /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:f
orScope /Zc:inline /Fo"build_info.dir\Release\\" /Fd"build_info.dir\Release\build_info.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\common\build-info.cpp"
    build-info.cpp
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"build_info.dir\Release\build_info.lib" /NOLOGO /MACHINE:X64  /machine:x64 "build_info.dir\Release\build-info.obj"
    build_info.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\build_info.dir\Release\build_info.lib
  FinalizeBuildStatus:
    Deleting file "build_info.dir\Release\build_info.tlog\unsuccessfulbuild".
    Touching "build_info.dir\Release\build_info.tlog\build_info.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\build_info.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.vcxproj" (4) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "ggml.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "ggml.dir\Release\ggml.tlog\".
  InitializeBuildStatus:
    Creating "ggml.dir\Release\ggml.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/CMakeLists.txt     
  ClCompile:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /I"C:\Users\...\AppData\Local\Temp\pip-instal
l-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _C
RT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /std:c11 /Fo"ggml.dir\Rel
ease\\" /Fd"ggml.dir\Release\ggml.pdb" /external:W1 /Gd /TC /errorReport:queue "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14d
a0b8efa01c3563d155\vendor\llama.cpp\ggml.c" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp
\ggml-alloc.c" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\ggml-backend.c" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\ggml-quants.c"
    ggml.c
    ggml-alloc.c
    ggml-backend.c
    ggml-quants.c
    Generating Code...
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"ggml.dir\Release\ggml.lib" /NOLOGO /MACHINE:X64  /machine:x64 ggml.dir\Release\ggml.obj
    "ggml.dir\Release\ggml-alloc.obj"
    "ggml.dir\Release\ggml-backend.obj"
    "ggml.dir\Release\ggml-quants.obj"
    ggml.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml.lib
  FinalizeBuildStatus:
    Deleting file "ggml.dir\Release\ggml.tlog\unsuccessfulbuild".
    Touching "ggml.dir\Release\ggml.tlog\ggml.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (5) on node 1 (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (5) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\llama.vcxproj" (6) on node 1 (default targets).
  PrepareForBuild:
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\bin\Release\".
    Creating directory "llama.dir\Release\llama.tlog\".
  InitializeBuildStatus:
    Creating "llama.dir\Release\llama.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llama.dir\Release\llama.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/CMakeLists.txt     
  ClCompile:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /I"C:\Users\...\AppData\Local\Temp\pip-instal
l-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _WINDLL /D _MBCS /D WIN32 /D _WINDOWS /D N
DEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D _CRT_SECURE_NO_WARNINGS /D _XOPEN_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /D llama_EXPORTS /Gm- /EHsc /MD /GS /arch:AVX512 /fp:prec
ise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llama.dir\Release\\" /Fd"llama.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\llama.cpp"
    llama.cpp
  C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\llama.cpp(2943,69): warning C4566: character represented by universal-character-name '\u010A' cannot be represented in the current code page (1252) [C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\llama.vcxproj]
  MakeDirsForLink:
    Creating directory "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\Release\".
  PreLinkEvent:
    Auto build dll exports
    setlocal
    cd C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp
    if %errorlevel% neq 0 goto :cmEnd
    C:
    if %errorlevel% neq 0 goto :cmEnd
    C:\Users\...\AppData\Local\Temp\pip-build-env-7b7ebcgu\normal\Lib\site-packages\cmake\data\bin\cmake.exe -E __create_def C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/llama.dir/Release/exports.def C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/llama.dir/Release//objects.txt 
    if %errorlevel% neq 0 goto :cmEnd
    :cmEnd
    endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
    :cmErrorLevel
    exit /b %1
    :cmDone
    if %errorlevel% neq 0 goto :VCEnd
    :VCEnd
  Link:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\...\AppData\
Local\Temp\tmpj4ezs7xu\build\bin\Release\llama.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdl
g32.lib advapi32.lib /DEF:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/llama.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"level='asInvoke
r' uiAccess='false'" /manifest:embed /PDB:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/bin/Release/llama.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/Release/llama.lib" /MACHINE:X64  /machine:x64 /DLL llama.dir\Release\llama.obj       
    C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
       Creating library C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/Release/llama.lib and object C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/Release/llama.exp
    llama.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\bin\Release\llama.dll
  FinalizeBuildStatus:
    Deleting file "llama.dir\Release\llama.tlog\unsuccessfulbuild".
    Touching "llama.dir\Release\llama.tlog\llama.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\llama.vcxproj" (default targets).
  PrepareForBuild:
    Creating directory "llava.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "llava.dir\Release\llava.tlog\".
  InitializeBuildStatus:
    Creating "llava.dir\Release\llava.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llava.dir\Release\llava.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/examples/llava/CMakeLists.txt
  ClCompile:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /I"C:\Users\...\AppData\Local\Temp\pip-instal
l-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-py
thon_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\..\.." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14d
a0b8efa01c3563d155\vendor\llama.cpp\examples\llava\..\..\common" /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c35
63d155\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D LLAMA_SHARED /D LLAMA_BUILD /D "CMAKE_INTDIR=\"Release\"
" /Gm- /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"llava.dir\Release\\" /Fd"llava.dir\Release\llava.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\U
sers\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\llava.cpp" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\clip.cpp"
    llava.cpp
    clip.cpp
  C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\clip.cpp(481,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
    C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\clip.cpp(481,9):
    __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function

    Generating Code...
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"llava.dir\Release\llava.lib" /NOLOGO /MACHINE:X64  /machine:x64 llava.dir\Release\llava.obj
    llava.dir\Release\clip.obj
    llava.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.lib
  FinalizeBuildStatus:
    Deleting file "llava.dir\Release\llava.tlog\unsuccessfulbuild".
    Touching "llava.dir\Release\llava.tlog\llava.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\common.vcxproj" (7) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "common.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\Release\".
    Creating directory "common.dir\Release\common.tlog\".
  InitializeBuildStatus:
    Creating "common.dir\Release\common.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "common.dir\Release\common.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/common/CMakeLists.txt
  ClCompile:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /I"C:\Users\...\AppData\Local\Temp\pip-instal
l-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\common\." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b
8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\." /nologo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D _CRT_SECURE_NO_WARNINGS /D _XOPEN
_SOURCE=600 /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /arch:AVX512 /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /Fo"common.dir\Release\\" /Fd"C:\Users\...
\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\Release\common.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\...\AppData\Local\Temp\pip-insta
ll-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\common\common.cpp" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-py
thon_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\common\sampling.cpp" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b
8efa01c3563d155\vendor\llama.cpp\common\console.cpp" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\
llama.cpp\common\grammar-parser.cpp" "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\common\train.cpp"
    common.cpp
    sampling.cpp
    console.cpp
    grammar-parser.cpp
    train.cpp
    Generating Code...
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\Release\common.lib" /NOLOGO /MACHINE:X64  /machine:x64 common.dir\Release\common.obj
    common.dir\Release\sampling.obj
    common.dir\Release\console.obj
    "common.dir\Release\grammar-parser.obj"
    common.dir\Release\train.obj
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\build_info.dir\Release\build-info.obj"
    common.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\Release\common.lib
  FinalizeBuildStatus:
    Deleting file "common.dir\Release\common.tlog\unsuccessfulbuild".
    Touching "common.dir\Release\common.tlog\common.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\common\common.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml_shared.vcxproj" (8) on node 1 (default targets).
  PrepareForBuild:
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "ggml_shared.dir\Release\ggml_shared.tlog\".
  InitializeBuildStatus:
    Creating "ggml_shared.dir\Release\ggml_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "ggml_shared.dir\Release\ggml_shared.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/CMakeLists.txt     
  PreLinkEvent:
    Auto build dll exports
    setlocal
    cd C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp
    if %errorlevel% neq 0 goto :cmEnd
    C:
    if %errorlevel% neq 0 goto :cmEnd
    C:\Users\...\AppData\Local\Temp\pip-build-env-7b7ebcgu\normal\Lib\site-packages\cmake\data\bin\cmake.exe -E __create_def C:/Users/.../AppData/Local/Temp
/tmpj4ezs7xu/build/vendor/llama.cpp/ggml_shared.dir/Release/exports.def C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/ggml_shared.dir/Release//objects.txt
    if %errorlevel% neq 0 goto :cmEnd
    :cmEnd
    endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
    :cmErrorLevel
    exit /b %1
    :cmDone
    if %errorlevel% neq 0 goto :VCEnd
    :VCEnd
  Link:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\...\AppData\
Local\Temp\tmpj4ezs7xu\build\bin\Release\ggml_shared.dll" /INCREMENTAL:NO /NOLOGO kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib
 comdlg32.lib advapi32.lib /DEF:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/ggml_shared.dir/Release/exports.def" /MANIFEST /MANIFESTUAC:"lev
el='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/bin/Release/ggml_shared.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /D
YNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/Release/ggml_shared.lib" /MACHINE:X64  /machine:x64 /DLL C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
       Creating library C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/Release/ggml_shared.lib and object C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/Release/ggml_shared.exp
    ggml_shared.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\bin\Release\ggml_shared.dll
  FinalizeBuildStatus:
    Deleting file "ggml_shared.dir\Release\ggml_shared.tlog\unsuccessfulbuild".
    Touching "ggml_shared.dir\Release\ggml_shared.tlog\ggml_shared.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml_shared.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml_static.vcxproj" (9) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "ggml_static.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "ggml_static.dir\Release\ggml_static.tlog\".
  InitializeBuildStatus:
    Creating "ggml_static.dir\Release\ggml_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "ggml_static.dir\Release\ggml_static.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/CMakeLists.txt     
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\...\AppData\Local\Temp\tmpj4ezs7
xu\build\vendor\llama.cpp\Release\ggml_static.lib" /NOLOGO /MACHINE:X64  /machine:x64 C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
    ggml_static.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\Release\ggml_static.lib
  FinalizeBuildStatus:
    Deleting file "ggml_static.dir\Release\ggml_static.tlog\unsuccessfulbuild".
    Touching "ggml_static.dir\Release\ggml_static.tlog\ggml_static.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml_static.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj" (10) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "llava-cli.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\Release\".
    Creating directory "llava-cli.dir\Release\llava-cli.tlog\".
  InitializeBuildStatus:
    Creating "llava-cli.dir\Release\llava-cli.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llava-cli.dir\Release\llava-cli.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/examples/llava/CMakeLists.txt
  ClCompile:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\CL.exe /c /I"C:\Users\...\AppData\Local\Temp\pip-instal
l-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\common\." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b
8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\l
lama.cpp\examples\llava\." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava
\..\.." /I"C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\..\..\common" /nol
ogo /W1 /WX- /diagnostics:column /O2 /Ob2 /D _MBCS /D WIN32 /D _WINDOWS /D NDEBUG /D "CMAKE_INTDIR=\"Release\"" /Gm- /EHsc /MD /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:
inline /Fo"llava-cli.dir\Release\\" /Fd"llava-cli.dir\Release\vc143.pdb" /external:W1 /Gd /TP /errorReport:queue "C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\llava-cli.cpp"
    llava-cli.cpp
  Link:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\...\AppData\
Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\Release\llava-cli.exe" /INCREMENTAL:NO /NOLOGO ..\..\common\Release\common.lib ..\..\Release\llama.lib kernel32
.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /ma
nifest:embed /PDB:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava-cli.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE
 /NXCOMPAT /IMPLIB:"C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava-cli.lib" /MACHINE:X64  /machine:x64 "llava-cli.dir\Release\llava-cli.obj"
    C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
    C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
       Creating library C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava-cli.lib and object C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava-cli.exp
    llava-cli.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\Release\llava-cli.exe
  FinalizeBuildStatus:
    Deleting file "llava-cli.dir\Release\llava-cli.tlog\unsuccessfulbuild".
    Touching "llava-cli.dir\Release\llava-cli.tlog\llava-cli.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava-cli.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (11) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "llava_shared.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "llava_shared.dir\Release\llava_shared.tlog\".
  InitializeBuildStatus:
    Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/examples/llava/CMakeLists.txt
  Link:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\...\AppData\
Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ..\..\Release\llama.lib kernel32.lib user32.lib gdi32.lib winspool.l
ib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/.
uin/AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/...
/AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64  /machine:x64 /DLL C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
    C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
    C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml.obj
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-alloc.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-backend.obj"
    "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\ggml.dir\Release\ggml-quants.obj"
       Creating library C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/.../AppData/Local/Temp/tmpj4ezs7xu/build/vendor/llama.cpp/examples/llava/Release/llava.exp
    llava_shared.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\Release\llava.dll
  FinalizeBuildStatus:
    Deleting file "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
    Touching "llava_shared.dir\Release\llava_shared.tlog\llava_shared.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets).
  Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (12) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "llava_static.dir\Release\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "llava_static.dir\Release\llava_static.tlog\".
  InitializeBuildStatus:
    Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/vendor/llama.cpp/examples/llava/CMakeLists.txt
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.38.33130\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\...\AppData\Local\Temp\tmpj4ezs7
xu\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64  /machine:x64 C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj
    C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
    llava_static.vcxproj -> C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib
  FinalizeBuildStatus:
    Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
    Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets).
  PrepareForBuild:
    Creating directory "x64\Release\ALL_BUILD\".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "x64\Release\ALL_BUILD\ALL_BUILD.tlog\".
  InitializeBuildStatus:
    Creating "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/CMakeLists.txt
  FinalizeBuildStatus:
    Deleting file "x64\Release\ALL_BUILD\ALL_BUILD.tlog\unsuccessfulbuild".
    Touching "x64\Release\ALL_BUILD\ALL_BUILD.tlog\ALL_BUILD.lastbuildstate".
  Done Building Project "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (default targets).

  Build succeeded.

  "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (5) ->
  "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\llama.vcxproj" (default target) (6) ->
  (ClCompile target) ->
    C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\llama.cpp(2943,69): warning C4566: character represented by universal-character-name '\u010A' cannot be represented in the current code page (1252) [C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\llama.vcxproj]

  "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (5) ->
    C:\Users\...\AppData\Local\Temp\pip-install-b0xe1k4f\llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155\vendor\llama.cpp\examples\llava\clip.cpp(481,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\...\AppData\Local\Temp\tmpj4ezs7xu\build\vendor\llama.cpp\examples\llava\llava.vcxproj]

      2 Warning(s)
      0 Error(s)

  Time Elapsed 00:00:13.24

  *** Installing project into wheel...
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/lib/ggml_shared.lib
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/bin/ggml_shared.dll
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/lib/cmake/Llama/LlamaConfig.cmake
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/lib/cmake/Llama/LlamaConfigVersion.cmake
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/include/ggml.h
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/lib/llama.lib
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/bin/llama.dll
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/include/llama.h
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/bin/convert.py
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/bin/convert-lora-to-ggml.py
  -- Installing: C:/Users/~1./AppData/Local/Temp/tmpj4ezs7xu/wheel/platlib/llama_cpp/llama.lib
  -- Installing: C:/Users/~1./AppData/Local/Temp/tmpj4ezs7xu/wheel/platlib/llama_cpp/llama.dll
  -- Installing: C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/llama_cpp/llama.lib
  -- Installing: C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/llama_cpp/llama.dll
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/lib/llava.lib
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/bin/llava.dll
  -- Installing: C:\Users\~1.\AppData\Local\Temp\tmpj4ezs7xu\wheel\platlib/bin/llava-cli.exe
  -- Installing: C:/Users/~1./AppData/Local/Temp/tmpj4ezs7xu/wheel/platlib/llama_cpp/llava.lib
  -- Installing: C:/Users/~1./AppData/Local/Temp/tmpj4ezs7xu/wheel/platlib/llama_cpp/llava.dll
  -- Installing: C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/llama_cpp/llava.lib
  -- Installing: C:/Users/.../AppData/Local/Temp/pip-install-b0xe1k4f/llama-cpp-python_24b8d3705ce14da0b8efa01c3563d155/llama_cpp/llava.dll
  *** Making wheel...
  *** Created llama_cpp_python-0.2.27-cp310-cp310-win_amd64.whl...
  Building wheel for llama-cpp-python (pyproject.toml) ... done
  Created wheel for llama-cpp-python: filename=llama_cpp_python-0.2.27-cp310-cp310-win_amd64.whl size=1897618 sha256=d2bab45373b37d3092669a05518e2f7b0396b3d61f3563489eb68da969803189
  Stored in directory: C:\Users\...\AppData\Local\Temp\pip-ephem-wheel-cache-zy06y9ww\wheels\8c\92\37\ada3fcfdf537bab790219920443164923e6cbfcbd80174af23
Successfully built llama-cpp-python
Installing collected packages: typing-extensions, numpy, diskcache, llama-cpp-python
  Attempting uninstall: typing-extensions
    Found existing installation: typing_extensions 4.9.0
    Uninstalling typing_extensions-4.9.0:
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\__pycache__\typing_extensions.cpython-310.pyc
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\typing_extensions-4.9.0.dist-info\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\typing_extensions.py
      Successfully uninstalled typing_extensions-4.9.0
  Attempting uninstall: numpy
    Found existing installation: numpy 1.26.3
    Uninstalling numpy-1.26.3:
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\numpy-1.26.3-cp310-cp310-win_amd64.whl
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\numpy-1.26.3.dist-info\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\numpy.libs\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\numpy\
      Removing file or directory c:\users\...\venv\my_venv\scripts\f2py.exe
      Successfully uninstalled numpy-1.26.3
  WARNING: Failed to remove contents in a temporary directory 'C:\Users\...\venv\my_venv\Lib\site-packages\~umpy.libs'.
  You can safely remove it manually.
  WARNING: Failed to remove contents in a temporary directory 'C:\Users\...\venv\my_venv\Lib\site-packages\~-mpy'.
  You can safely remove it manually.
  Attempting uninstall: diskcache
    Found existing installation: diskcache 5.6.3
    Uninstalling diskcache-5.6.3:
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\diskcache-5.6.3.dist-info\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\diskcache\
      Successfully uninstalled diskcache-5.6.3
  Attempting uninstall: llama-cpp-python
    Found existing installation: llama_cpp_python 0.2.27
    Uninstalling llama_cpp_python-0.2.27:
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\bin\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\include\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\lib\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\llama_cpp\
      Removing file or directory c:\users\...\venv\my_venv\lib\site-packages\llama_cpp_python-0.2.27.dist-info\
      Successfully uninstalled llama_cpp_python-0.2.27
  WARNING: Failed to remove contents in a temporary directory 'C:\Users\...\venv\my_venv\Lib\site-packages\~lama_cpp'.
  You can safely remove it manually.
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
numba 0.58.0 requires numpy<1.26,>=1.21, but you have numpy 1.26.3 which is incompatible.
Gitclop commented 8 months ago

Found my error. I use PyCharm with a virtual environment. Setting the environment variables from the terminal did not work, but setting them inside the PyCharm settings > Tools > Terminal did the trick.
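An alternative that sidesteps the terminal question entirely is to set the variables in the same Python process that launches pip, so they are guaranteed to reach the build no matter which shell PyCharm spawns. A minimal sketch (the flags and pip arguments mirror the install command at the top of this issue; the actual rebuild is left commented out because it takes several minutes):

```python
import os
import subprocess
import sys

# Copy the current environment and add the CUDA build flags from the README.
# Because this env dict is passed straight to the pip subprocess, it cannot
# be lost between a terminal session and the PyCharm venv.
env = dict(os.environ)
env["CMAKE_ARGS"] = "-DLLAMA_CUBLAS=on"
env["FORCE_CMAKE"] = "1"

# Same reinstall command as in the issue, run via the current interpreter
# so it targets the active virtual environment.
cmd = [
    sys.executable, "-m", "pip", "install",
    "llama-cpp-python", "--upgrade", "--force-reinstall",
    "--no-cache-dir", "--verbose",
]
print("Would run:", " ".join(cmd))

# Uncomment to actually rebuild the wheel with CUDA support:
# subprocess.run(cmd, env=env, check=True)
```

Using `sys.executable` also rules out the case where `pip` on PATH belongs to a different interpreter than the venv PyCharm is using.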

yousecjoe commented 8 months ago

Found my error. I use PyCharm with a virtual environment. Setting the environment variables from the terminal did not work, but setting them inside the PyCharm settings > Tools > Terminal did the trick.

Setting environment variables does work. This could be an end-user error. You may not be activating the virtual environment from inside the terminal.

python -m virtualenv XXX
.\XXX\Scripts\activate
python -m pip install --upgrade poetry
python -m poetry install --with ui,local
python -m poetry install --extras chroma
python -m poetry run python scripts/setup
$env:CMAKE_ARGS='-DLLAMA_CUBLAS=on'; poetry run pip install llama-cpp-python --force-reinstall --no-cache-dir
make run
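Once the wheel is rebuilt, the quickest check that offloading actually happened is the verbose llama.cpp init log: loading a model with `n_gpu_layers` set should produce a line like `llm_load_tensors: offloaded 33/33 layers to GPU`, while a CPU-only wheel reports 0 offloaded layers (or no such line at all). A small sketch of parsing that log line, assuming the `offloaded N/M layers to GPU` wording from llama.cpp builds of this era:

```python
import re

def gpu_layers_offloaded(log_text: str) -> int:
    """Return the number of layers llama.cpp reports as offloaded to the GPU.

    Scans the verbose init log for "offloaded N/M layers to GPU" and returns
    N; returns 0 when no such line is found, which is what a CPU-only build
    of the wheel produces.
    """
    m = re.search(r"offloaded (\d+)/\d+ layers to GPU", log_text)
    return int(m.group(1)) if m else 0

# Example log line as emitted by a working CUDA build (hypothetical values):
sample = "llm_load_tensors: offloaded 33/33 layers to GPU"
print(gpu_layers_offloaded(sample))  # → 33
```

If this number stays at 0 even with `n_gpu_layers=-1`, the wheel was almost certainly compiled without cuBLAS and the `CMAKE_ARGS` variable never reached the build, which is exactly the failure mode discussed above.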