ggerganov / llama.cpp

LLM inference in C/C++
MIT License

Feature Request: Why is there no pre-compiled Windows version of AMD ROCm? #8169

Open wangzi7654321 opened 5 days ago

wangzi7654321 commented 5 days ago

Feature Description

Pre-compiled Windows binaries of llama.cpp built with AMD ROCm support (as of release b3248).

Motivation

Some users are unable to compile llama.cpp on their own machines.

Possible Implementation

No response

henk717 commented 5 days ago

If it helps, the KoboldCpp ROCm builds are precompiled for Windows: https://github.com/YellowRoseCx/koboldcpp-rocm/releases. They may carry you over until official binaries are provided.
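
For users who do want to attempt a local build in the meantime, a minimal sketch of a ROCm/HIP build on Windows might look like the following. This assumes the AMD HIP SDK for Windows is installed (so `HIP_PATH` is set), that Ninja and CMake are available, and that the CMake flag names (`GGML_HIPBLAS`, `AMDGPU_TARGETS`) match the llama.cpp version being built; these flags have been renamed across releases, so check the repository's build documentation for the exact spelling. The `gfx1100` target is only an example (RDNA3); substitute the architecture of the actual GPU.

```shell
REM Hypothetical Windows build sketch, assuming the AMD HIP SDK is installed
REM and %HIP_PATH% points at it. Flag names may differ between llama.cpp versions.
set PATH=%HIP_PATH%\bin;%PATH%

REM Configure with the HIP SDK's clang and enable the ROCm (hipBLAS) backend.
cmake -S . -B build -G Ninja ^
  -DCMAKE_C_COMPILER=clang ^
  -DCMAKE_CXX_COMPILER=clang++ ^
  -DGGML_HIPBLAS=ON ^
  -DAMDGPU_TARGETS=gfx1100 ^
  -DCMAKE_BUILD_TYPE=Release

REM Build everything (llama-cli, llama-server, etc.).
cmake --build build --config Release
```

If the configure step cannot find HIP, passing `-DCMAKE_PREFIX_PATH=%HIP_PATH%` is a common workaround; the compilation failures that motivate this issue usually come from a mismatched or missing HIP SDK rather than from llama.cpp itself.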