ggerganov / llama.cpp

LLM inference in C/C++
MIT License

Bug: ABI problem in binary file "llama-b3187-bin-win-msvc-arm64.zip" #8050

Open Billzhong2022 opened 3 months ago

Billzhong2022 commented 3 months ago

What happened?

In release tag https://github.com/ggerganov/llama.cpp/releases/tag/b3187, the file "llama-cli.exe" inside the archive "llama-b3187-bin-win-msvc-arm64.zip" uses the Windows x64 ABI, not the Windows ARM64 ABI. What is the reason? Why is it labeled "win-msvc-arm64"?

Logs:

C:\llama-b3187-bin-win-msvc-arm64>dumpbin /headers llama-cli.exe
FILE HEADER VALUES
        8664 machine (x64)

Also, llama-cli.exe depends on the four libraries below. Where should I download them?

libstdc++-6.dll
libwinpthread-1.dll
libgcc_s_seh-1.dll
libgomp-1.dll

Name and Version

Tag b3187

What operating system are you seeing the problem on?

No response

Relevant log output

N/A
Billzhong2022 commented 3 months ago

Hi llama.cpp team,

The steps below build a llama.cpp Windows ARM64 binary with the MSVC toolchain.

  1. Download and install the tools.

Download the latest CMake Windows ARM64 installer from https://cmake.org/download/ and install it. Download Visual Studio 2022 and install it.

  2. From the Windows command prompt, change to the "llama.cpp" source directory and run the following commands to build the binaries:

mkdir build
cd build
cmake .. -A ARM64
cmake --build . --config Release