hhao / ollama

Get up and running with Llama 3, Mistral, Gemma, and other large language models.
https://ollama.com
MIT License

gcc_libinit_windows.c:143:27: error: implicit declaration of function ‘_beginthread’; did you mean ‘_cgo_beginthread’? [-Werror=implicit-function-declaration] #4

Open · insinfo opened this issue 2 months ago

insinfo commented 2 months ago

What is the issue?

Error when trying to compile for Windows on MSYS2.

cd /c/
cd my_cpp_projects/

git clone -b minicpm-v2.5 https://github.com/OpenBMB/ollama.git
cd ollama/llm
git clone -b minicpm-v2.5 https://github.com/OpenBMB/llama.cpp.git
cd ../

pacman -Syu

pacman -S base-devel gcc vim cmake

pacman -S git

pacman -S mingw-w64-x86_64-go

export GOPATH=/mingw64
export GOROOT=/mingw64/lib/go
export CGO_ENABLED="1"

pacman -S mingw-w64-x86_64-make

go version
go version go1.22.5 windows/amd64

$ gcc --version
gcc (GCC) 13.3.0
Copyright (C) 2023 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
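Note that pacman -S gcc installs the MSYS-environment compiler at /usr/bin/gcc, which targets x86_64-pc-msys; cgo on Windows needs the mingw-w64 compiler from /mingw64/bin instead. A quick check, assuming a standard MSYS2 layout and the usual MSYS2 package names:

pacman -S mingw-w64-x86_64-gcc
export PATH=/mingw64/bin:$PATH
gcc -dumpmachine   # expect x86_64-w64-mingw32, not x86_64-pc-msys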
suporte@DESKTOP-HGLUJLD MINGW64 /c/my_cpp_projects/ollama
$ go generate ./...
go: downloading github.com/google/uuid v1.1.2
go: downloading golang.org/x/crypto v0.23.0
go: downloading github.com/containerd/console v1.0.3
go: downloading github.com/mattn/go-runewidth v0.0.14
go: downloading github.com/olekukonko/tablewriter v0.0.5
go: downloading github.com/spf13/cobra v1.7.0
go: downloading golang.org/x/term v0.20.0
go: downloading github.com/d4l3k/go-bfloat16 v0.0.0-20211005043715-690c3bdd05f1
go: downloading github.com/nlpodyssey/gopickle v0.3.0
go: downloading github.com/pdevine/tensor v0.0.0-20240510204454-f88f4562727c
go: downloading github.com/x448/float16 v0.8.4
go: downloading google.golang.org/protobuf v1.34.1
go: downloading github.com/gin-gonic/gin v1.10.0
go: downloading github.com/emirpasic/gods v1.18.1
go: downloading github.com/gin-contrib/cors v1.7.2
go: downloading github.com/rivo/uniseg v0.2.0
go: downloading github.com/inconshreveable/mousetrap v1.1.0
go: downloading github.com/spf13/pflag v1.0.5
go: downloading golang.org/x/text v0.15.0
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/apache/arrow/go/arrow v0.0.0-20211112161151-bc219186db40
go: downloading github.com/chewxy/hm v1.0.0
go: downloading go4.org/unsafe/assume-no-moving-gc v0.0.0-20231121144256-b99613f794b6
go: downloading github.com/google/flatbuffers v24.3.25+incompatible
go: downloading github.com/chewxy/math32 v1.10.1
go: downloading gonum.org/v1/gonum v0.15.0
go: downloading gorgonia.org/vecf32 v0.9.0
go: downloading gorgonia.org/vecf64 v0.9.0
go: downloading github.com/gin-contrib/sse v0.1.0
go: downloading github.com/mattn/go-isatty v0.0.20
go: downloading golang.org/x/net v0.25.0
go: downloading golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading github.com/golang/protobuf v1.5.4
go: downloading github.com/xtgo/set v1.0.0
go: downloading github.com/go-playground/validator/v10 v10.20.0
go: downloading github.com/pelletier/go-toml/v2 v2.2.2
go: downloading github.com/ugorji/go/codec v1.2.12
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading github.com/gabriel-vasile/mimetype v1.4.3
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/leodido/go-urn v1.4.0
go: downloading github.com/go-playground/locales v0.14.1
Already on 'minicpm-v2.5'
Your branch is up to date with 'origin/minicpm-v2.5'.
Submodule path '../llama.cpp': checked out '65f7455cea443bd9b6fd8546ef53440d6f6d963f'
Checking for MinGW...

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Application     gcc.exe                                            0.0.0.0    C:\msys64\usr\bin\gcc.exe
Application     mingw32-make.exe                                   0.0.0.0    C:\msys64\mingw64\bin\mingw32-make.exe
Building static library
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64_static -G MinGW Makefiles -DCMAKE_C_COMPILER=gcc.exe -DCMAKE_CXX_COMPILER=g++.exe -DBUILD_SHARED_LIBS=off -DLLAMA_NATIVE=off -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_F16C=off -DLLAMA_FMA=off
cmake version 3.30.1

CMake suite maintained and supported by Kitware (kitware.com/cmake).
CMake Error: Could not create named generator MinGW Makefiles

Generators
* Unix Makefiles               = Generates standard UNIX makefiles.
  Ninja                        = Generates build.ninja files.
  Ninja Multi-Config           = Generates build-<Config>.ninja files.
  CodeBlocks - Ninja           = Generates CodeBlocks project files
                                 (deprecated).
  CodeBlocks - Unix Makefiles  = Generates CodeBlocks project files
                                 (deprecated).
  CodeLite - Ninja             = Generates CodeLite project files
                                 (deprecated).
  CodeLite - Unix Makefiles    = Generates CodeLite project files
                                 (deprecated).
  Eclipse CDT4 - Ninja         = Generates Eclipse CDT 4.0 project files
                                 (deprecated).
  Eclipse CDT4 - Unix Makefiles= Generates Eclipse CDT 4.0 project files
                                 (deprecated).
  Kate - Ninja                 = Generates Kate project files (deprecated).
  Kate - Ninja Multi-Config    = Generates Kate project files (deprecated).
  Kate - Unix Makefiles        = Generates Kate project files (deprecated).
  Sublime Text 2 - Ninja       = Generates Sublime Text 2 project files
                                 (deprecated).
  Sublime Text 2 - Unix Makefiles
                               = Generates Sublime Text 2 project files
                                 (deprecated).

llm\generate\generate_windows.go:3: running "powershell": exit status 1
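The cmake here is the MSYS-environment build, whose generator list (above) has no "MinGW Makefiles" entry; the mingw-w64 cmake package does ship that generator. A minimal workaround sketch, assuming standard MSYS2 package names:

pacman -S mingw-w64-x86_64-cmake
export PATH=/mingw64/bin:$PATH
which cmake    # should now resolve to /mingw64/bin/cmake
cmake --help   # "MinGW Makefiles" should appear in the generator list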

suporte@DESKTOP-HGLUJLD MINGW64 /c/my_cpp_projects/ollama
$ go build .
# runtime/cgo
gcc_libinit_windows.c: In function ‘_cgo_beginthread’:
gcc_libinit_windows.c:143:27: error: implicit declaration of function ‘_beginthread’; did you mean ‘_cgo_beginthread’? [-Werror=implicit-function-declaration]
  143 |                 thandle = _beginthread(func, 0, arg);
      |                           ^~~~~~~~~~~~
      |                           _cgo_beginthread
cc1: all warnings being treated as errors

suporte@DESKTOP-HGLUJLD MINGW64 /c/my_cpp_projects/ollama
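_beginthread is a Windows CRT function declared in <process.h>; the MSYS gcc found above (C:\msys64\usr\bin\gcc.exe) uses a Cygwin-style runtime whose headers do not declare it, so cgo's gcc_libinit_windows.c cannot compile. A sketch of the fix, assuming the mingw-w64 toolchain is installed under /mingw64:

gcc -dumpmachine                 # x86_64-pc-msys means the wrong compiler for cgo
export PATH=/mingw64/bin:$PATH
export CC=/mingw64/bin/gcc CXX=/mingw64/bin/g++
go clean -cache
go build .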

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

No response

insinfo commented 2 months ago

In PowerShell it gives this error:

PS C:\my_cpp_projects\ollama> $env:CGO_ENABLED="1"
PS C:\my_cpp_projects\ollama> go generate ./...
Submodule 'llama.cpp' (https://github.com/ggerganov/llama.cpp.git) registered for path '../llama.cpp'
M       llm/llama.cpp
Already on 'minicpm-v2.5'
Your branch is up to date with 'origin/minicpm-v2.5'.
Submodule path '../llama.cpp': checked out '65f7455cea443bd9b6fd8546ef53440d6f6d963f'
Checking for MinGW...

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Application     gcc.exe                                            0.0.0.0    C:\mingw64\bin\gcc.exe
Application     mingw32-make.exe                                   0.0.0.0    C:\mingw64\bin\mingw32-make.exe
Building static library
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64_static -G MinGW Makefiles -DCMAKE_C_COMPILER=gcc.exe -DCMAKE_CXX_COMPILER=g++.exe -DBUILD_SHARED_LIBS=off -DLLAMA_NATIVE=off -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_F16C=off -DLLAMA_FMA=off
cmake version 3.30.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
-- The C compiler identification is GNU 14.2.0
-- The CXX compiler identification is GNU 14.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/mingw64/bin/gcc.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/mingw64/bin/g++.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.46.0.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- OpenMP found
-- ccache found, compilation results will be cached. Disable with LLAMA_CCACHE=OFF.
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- x86 detected
-- Configuring done (6.4s)
-- Generating done (1.9s)
-- Build files have been written to: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static
building with: cmake --build ../build/windows/amd64_static --config Release --target llama --target ggml
[  0%] Building C object CMakeFiles/ggml.dir/ggml.c.obj
C:\my_cpp_projects\ollama\llm\llama.cpp\ggml.c:84:8: warning: type qualifiers ignored on function return type [-Wignored-qualifiers]
   84 | static atomic_bool atomic_flag_test_and_set(atomic_flag * ptr) {
      |        ^~~~~~~~~~~
[ 16%] Building C object CMakeFiles/ggml.dir/ggml-alloc.c.obj
[ 16%] Building C object CMakeFiles/ggml.dir/ggml-backend.c.obj
[ 33%] Building C object CMakeFiles/ggml.dir/ggml-quants.c.obj
[ 50%] Building CXX object CMakeFiles/ggml.dir/sgemm.cpp.obj
[ 50%] Built target ggml
[ 66%] Building CXX object CMakeFiles/llama.dir/llama.cpp.obj
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In member function 'std::string llama_file::GetErrorMessageWin32(DWORD) const':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:1319:46: warning: format '%s' expects argument of type 'char*', but argument 2 has type 'DWORD' {aka 'long unsigned int'} [-Wformat=]
 1319 |             ret = format("Win32 error code: %s", error_code);
      |                                             ~^   ~~~~~~~~~~
      |                                              |   |
      |                                              |   DWORD {aka long unsigned int}
      |                                              char*
      |                                             %ld
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In constructor 'llama_mmap::llama_mmap(llama_file*, size_t, bool)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:1657:38: warning: cast between incompatible function types from 'FARPROC' {aka 'long long int (*)()'} to 'BOOL (*)(HANDLE, ULONG_PTR, PWIN32_MEMORY_RANGE_ENTRY, ULONG)' {aka 'int (*)(void*, long long unsigned int, _WIN32_MEMORY_RANGE_ENTRY*, long unsigned int)'} [-Wcast-function-type]
 1657 |             pPrefetchVirtualMemory = reinterpret_cast<decltype(pPrefetchVirtualMemory)> (GetProcAddress(hKernel32, "PrefetchVirtualMemory"));
      |                                      ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In function 'float* llama_get_logits_ith(llama_context*, int32_t)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:18512:65: warning: format '%lu' expects argument of type 'long unsigned int', but argument 2 has type 'std::vector<int>::size_type' {aka 'long long unsigned int'} [-Wformat=]
18512 |             throw std::runtime_error(format("out of range [0, %lu)", ctx->output_ids.size()));
      |                                                               ~~^    ~~~~~~~~~~~~~~~~~~~~~~
      |                                                                 |                        |
      |                                                                 long unsigned int        std::vector<int>::size_type {aka long long unsigned int}
      |                                                               %llu
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In function 'float* llama_get_embeddings_ith(llama_context*, int32_t)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:18557:65: warning: format '%lu' expects argument of type 'long unsigned int', but argument 2 has type 'std::vector<int>::size_type' {aka 'long long unsigned int'} [-Wformat=]
18557 |             throw std::runtime_error(format("out of range [0, %lu)", ctx->output_ids.size()));
      |                                                               ~~^    ~~~~~~~~~~~~~~~~~~~~~~
      |                                                                 |                        |
      |                                                                 long unsigned int        std::vector<int>::size_type {aka long long unsigned int}
      |                                                               %llu
[ 83%] Building CXX object CMakeFiles/llama.dir/unicode.cpp.obj
[ 83%] Building CXX object CMakeFiles/llama.dir/unicode-data.cpp.obj
[100%] Linking CXX static library libllama.a
[100%] Built target llama
[100%] Built target ggml
Building LCD CPU
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64/cpu -DCMAKE_POSITION_INDEPENDENT_CODE=on -A x64 -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -DBUILD_SHARED_LIBS=on -DLLAMA_NATIVE=off -DLLAMA_SERVER_VERBOSE=off -DCMAKE_BUILD_TYPE=Release
cmake version 3.30.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
-- Building for: Ninja
CMake Error at CMakeLists.txt:2 (project):
  Generator

    Ninja

  does not support platform specification, but platform

    x64

  was specified.

CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
-- Configuring incomplete, errors occurred!
llm\generate\generate_windows.go:3: running "powershell": exit status 1
PS C:\my_cpp_projects\ollama>
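This configure step fails because -A x64 is an architecture flag only the Visual Studio generators accept; once CMake defaults to Ninja the flag is rejected, and the compiler variables are then never set. A hand-run configure sketch with -A dropped, assuming the same source and build directories as the log above:

cmake -S llm/llama.cpp -B llm/build/windows/amd64/cpu -G Ninja -DCMAKE_C_COMPILER=gcc.exe -DCMAKE_CXX_COMPILER=g++.exe -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=on -DBUILD_SHARED_LIBS=on -DLLAMA_NATIVE=off -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_F16C=off -DLLAMA_FMA=off -DLLAMA_SERVER_VERBOSE=off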
insinfo commented 2 months ago

I edited the CMakeLists.txt file, adding lines that set the compiler paths, and then got this other error.

C:\my_cpp_projects\ollama\llm\llama.cpp\CMakeLists.txt

cmake_minimum_required(VERSION 3.14) # for add_link_options and implicit target directories.

set( CMAKE_CXX_COMPILER "C:/mingw64/bin/g++.exe" )
set( CMAKE_C_COMPILER "C:/mingw64/bin/gcc.exe" )

project("llama.cpp" C CXX)
include(CheckIncludeFileCXX)
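An alternative that avoids editing CMakeLists.txt: CMake honours the CC and CXX environment variables on the first configure of a fresh build directory, so the same compilers can be injected from PowerShell (paths taken from this log):

$env:CC = "C:/mingw64/bin/gcc.exe"
$env:CXX = "C:/mingw64/bin/g++.exe"
go generate ./...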
PS C:\my_cpp_projects\ollama> go build .
# github.com/ollama/ollama
C:\Program Files\Go\pkg\tool\windows_amd64\link.exe: running gcc failed: exit status 1
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0xbd0): undefined reference to `GOMP_barrier'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x32ba): undefined reference to `GOMP_barrier'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x869d): undefined reference to `GOMP_barrier'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x12dc8): undefined reference to `GOMP_barrier'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x1319f): undefined reference to `GOMP_barrier'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x15a84): more undefined references to `GOMP_barrier' follow
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x400d9): undefined reference to `GOMP_single_start'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x400e2): undefined reference to `omp_get_num_threads'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x400f0): undefined reference to `GOMP_barrier'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x400fe): undefined reference to `omp_get_thread_num'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x45263): undefined reference to `GOMP_parallel'
C:/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/14.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static/libllama.a(ggml.c.obj):ggml.c:(.text+0x454e2): undefined reference to `GOMP_parallel'
collect2.exe: error: ld returned 1 exit status
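The undefined GOMP_* and omp_* symbols belong to GCC's OpenMP runtime: libllama.a was compiled with -fopenmp (the configure log reported "OpenMP found"), but the final cgo link never adds -lgomp. A workaround sketch; CGO_LDFLAGS is standard cgo, while the OpenMP-off alternative assumes this llama.cpp checkout exposes such a CMake option:

$env:CGO_LDFLAGS = "-lgomp"
go build .
# or reconfigure llama.cpp with OpenMP disabled, if the checkout supports it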
insinfo commented 2 months ago

I tried again with another MinGW distribution:

https://nuwen.net/mingw.html

PS C:\my_cpp_projects> git clone -b minicpm-v2.5 https://github.com/OpenBMB/ollama.git
Cloning into 'ollama'...
remote: Enumerating objects: 15715, done.
remote: Counting objects: 100% (192/192), done.
remote: Compressing objects: 100% (151/151), done.
remote: Total 15715 (delta 77), reused 100 (delta 36), pack-reused 15523
Receiving objects: 100% (15715/15715), 8.34 MiB | 8.47 MiB/s, done.
Resolving deltas: 100% (10032/10032), done.
PS C:\my_cpp_projects> cd ollama/llm
PS C:\my_cpp_projects\ollama\llm> git clone -b minicpm-v2.5 https://github.com/OpenBMB/llama.cpp.git
Cloning into 'llama.cpp'...
remote: Enumerating objects: 21989, done.
remote: Counting objects: 100% (6049/6049), done.
remote: Compressing objects: 100% (447/447), done.
remote: Total 21989 (delta 5816), reused 5648 (delta 5601), pack-reused 15940
Receiving objects: 100% (21989/21989), 41.78 MiB | 19.19 MiB/s, done.
Resolving deltas: 100% (15736/15736), done.
PS C:\my_cpp_projects\ollama\llm> cd ../
PS C:\my_cpp_projects\ollama> go generate ./...
Submodule 'llama.cpp' (https://github.com/ggerganov/llama.cpp.git) registered for path '../llama.cpp'
M       llm/llama.cpp
Already on 'minicpm-v2.5'
Your branch is up to date with 'origin/minicpm-v2.5'.
Submodule path '../llama.cpp': checked out '65f7455cea443bd9b6fd8546ef53440d6f6d963f'
Checking for MinGW...

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Application     gcc.exe                                            0.0.0.0    C:\mingw64\bin\gcc.exe
Application     mingw32-make.exe                                   0.0.0.0    C:\mingw64\bin\mingw32-make.exe
Building static library
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64_static -G MinGW Makefiles -DCMAKE_C_COMPILER=gcc.exe -DCMAKE_CXX_COMPILER=g++.exe -DBUILD_SHARED_LIBS=off -DLLAMA_NATIVE=off -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_F16C=off -DLLAMA_FMA=off
cmake version 3.30.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
-- The C compiler identification is GNU 13.2.0
-- The CXX compiler identification is GNU 13.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/mingw64/bin/gcc.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/mingw64/bin/g++.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.46.0.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- OpenMP found
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- x86 detected
-- Configuring done (6.1s)
-- Generating done (2.1s)
-- Build files have been written to: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static
building with: cmake --build ../build/windows/amd64_static --config Release --target llama --target ggml
[  0%] Building C object CMakeFiles/ggml.dir/ggml.c.obj
C:\my_cpp_projects\ollama\llm\llama.cpp\ggml.c:84:8: warning: type qualifiers ignored on function return type [-Wignored-qualifiers]
   84 | static atomic_bool atomic_flag_test_and_set(atomic_flag * ptr) {
      |        ^~~~~~~~~~~
[ 16%] Building C object CMakeFiles/ggml.dir/ggml-alloc.c.obj
[ 16%] Building C object CMakeFiles/ggml.dir/ggml-backend.c.obj
[ 33%] Building C object CMakeFiles/ggml.dir/ggml-quants.c.obj
[ 50%] Building CXX object CMakeFiles/ggml.dir/sgemm.cpp.obj
[ 50%] Built target ggml
[ 66%] Building CXX object CMakeFiles/llama.dir/llama.cpp.obj
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In member function 'std::string llama_file::GetErrorMessageWin32(DWORD) const':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:1319:46: warning: format '%s' expects argument of type 'char*', but argument 2 has type 'DWORD' {aka 'long unsigned int'} [-Wformat=]
 1319 |             ret = format("Win32 error code: %s", error_code);
      |                                             ~^   ~~~~~~~~~~
      |                                              |   |
      |                                              |   DWORD {aka long unsigned int}
      |                                              char*
      |                                             %ld
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In constructor 'llama_mmap::llama_mmap(llama_file*, size_t, bool)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:1657:38: warning: cast between incompatible function types from 'FARPROC' {aka 'long long int (*)()'} to 'BOOL (*)(HANDLE, ULONG_PTR, PWIN32_MEMORY_RANGE_ENTRY, ULONG)' {aka 'int (*)(void*, long long unsigned int, _WIN32_MEMORY_RANGE_ENTRY*, long unsigned int)'} [-Wcast-function-type]
 1657 |             pPrefetchVirtualMemory = reinterpret_cast<decltype(pPrefetchVirtualMemory)> (GetProcAddress(hKernel32, "PrefetchVirtualMemory"));
      |                                      ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In function 'float* llama_get_logits_ith(llama_context*, int32_t)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:18512:65: warning: format '%lu' expects argument of type 'long unsigned int', but argument 2 has type 'std::vector<int>::size_type' {aka 'long long unsigned int'} [-Wformat=]
18512 |             throw std::runtime_error(format("out of range [0, %lu)", ctx->output_ids.size()));
      |                                                               ~~^    ~~~~~~~~~~~~~~~~~~~~~~
      |                                                                 |                        |
      |                                                                 long unsigned int        std::vector<int>::size_type {aka long long unsigned int}
      |                                                               %llu
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In function 'float* llama_get_embeddings_ith(llama_context*, int32_t)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:18557:65: warning: format '%lu' expects argument of type 'long unsigned int', but argument 2 has type 'std::vector<int>::size_type' {aka 'long long unsigned int'} [-Wformat=]
18557 |             throw std::runtime_error(format("out of range [0, %lu)", ctx->output_ids.size()));
      |                                                               ~~^    ~~~~~~~~~~~~~~~~~~~~~~
      |                                                                 |                        |
      |                                                                 long unsigned int        std::vector<int>::size_type {aka long long unsigned int}
      |                                                               %llu
[ 83%] Building CXX object CMakeFiles/llama.dir/unicode.cpp.obj
[ 83%] Building CXX object CMakeFiles/llama.dir/unicode-data.cpp.obj
[100%] Linking CXX static library libllama.a
[100%] Built target llama
[100%] Built target ggml
Building LCD CPU
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64/cpu -DCMAKE_POSITION_INDEPENDENT_CODE=on -A x64 -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -DBUILD_SHARED_LIBS=on -DLLAMA_NATIVE=off -DLLAMA_SERVER_VERBOSE=off -DCMAKE_BUILD_TYPE=Release
cmake version 3.30.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.19045.
-- The C compiler identification is MSVC 19.40.33813.0
-- The CXX compiler identification is MSVC 19.40.33813.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.46.0.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- OpenMP found
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM: x64
-- x86 detected
-- Configuring done (36.2s)
-- Generating done (1.2s)
CMake Warning:
  Manually-specified variables were not used by the project:

    LLAMA_F16C

-- Build files have been written to: C:/my_cpp_projects/ollama/llm/build/windows/amd64/cpu
building with: cmake --build ../build/windows/amd64/cpu --config Release --target ollama_llama_server
MSBuild version 17.10.4+10fbfbf2e for .NET Framework
MSBUILD : error MSB1009: Project file does not exist.
Switch: ollama_llama_server.vcxproj
llm\generate\generate_windows.go:3: running "powershell": exit status 1
PS C:\my_cpp_projects\ollama>
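The MSB1009 error means CMake fell back to the Visual Studio generator and wrote a solution, but the solution contains no ollama_llama_server.vcxproj, which suggests the target the generate script asks for was never generated from this llama.cpp checkout. A PowerShell check of which project files actually exist (paths from this log):

Get-ChildItem llm\build\windows\amd64\cpu -Recurse -Filter *.vcxproj | Select-Object Name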