WebAssembly binding for llama.cpp - Enabling in-browser LLM inference
Failed to build from scratch: llamacpp-wasm-builder, CMake Error (add_executable): Cannot find source file #76
Closed — flatsiedatsie closed this 3 months ago
This happens after running:

git submodule update --remote --merge
Perhaps related to this change in llama.cpp: https://github.com/ggerganov/llama.cpp/commit/f3f65429c44bb195a9195bfdc19a30a79709db7b
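For anyone hitting the same thing: `git submodule update --remote --merge` moves the submodule to the upstream branch tip instead of the commit pinned by the superproject, so if upstream has since moved or renamed source files (as the llama.cpp commit above appears to do), a CMakeLists that references files by name can stop matching and CMake reports "Cannot find source file". The following is a minimal sketch of that failure mode using throwaway local repos; all paths and file names here are hypothetical, not the actual llamacpp-wasm-builder layout.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Upstream "library" repo with a source file the build expects.
git init -q -b main lib && cd lib
git config user.email t@example.com && git config user.name t
echo 'int main(void){return 0;}' > main.c
git add main.c && git commit -qm 'add main.c'
pinned=$(git rev-parse HEAD)
# Upstream later renames the file (analogous to llama.cpp reorganizing sources).
git mv main.c app.c && git commit -qm 'rename main.c to app.c'
cd ..

# Superproject pins the submodule at the commit that still has main.c.
git init -q -b main super && cd super
git config user.email t@example.com && git config user.name t
git -c protocol.file.allow=always submodule --quiet add -b main ../lib lib
git -C lib checkout -q "$pinned"
git add lib .gitmodules && git commit -qm 'pin lib at known-good commit'

# At the pinned commit the expected file exists; the build would succeed.
[ -e lib/main.c ]

# `--remote` fast-forwards the submodule to upstream HEAD, where main.c is gone.
git -c protocol.file.allow=always submodule update --remote
[ -e lib/app.c ]       # renamed file is now present
[ ! -e lib/main.c ]    # main.c gone -> CMake "Cannot find source file"
```

A plain `git submodule update --init` (without `--remote`) checks out exactly the commit the superproject recorded, which is usually what a builder repo like this one expects.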