ngxson / wllama

WebAssembly binding for llama.cpp - Enabling in-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License
371 stars · 18 forks

Failed to build from scratch: llamacpp-wasm-builder, CMake Error (add_executable): Cannot find source file #76

Closed · flatsiedatsie closed this issue 3 months ago

flatsiedatsie commented 3 months ago

This occurs after running:

# git submodule update --remote --merge

[Screenshot 2024-06-26 at 18:55:39: CMake error output]

Perhaps it's related to this llama.cpp change: https://github.com/ggerganov/llama.cpp/commit/f3f65429c44bb195a9195bfdc19a30a79709db7b
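For context on why `add_executable` might stop finding source files after a `git submodule update --remote --merge`: the linked commit appears to be llama.cpp's source reorganization, which moved sources out of the repository root (e.g. into `src/`), so any build script still referencing the old paths would fail. Below is a minimal diagnostic sketch, not part of wllama or llamacpp-wasm-builder; the submodule directory name `llama.cpp` and the file paths checked are assumptions based on that commit, so adjust them to wherever the builder actually places the submodule.

```shell
#!/bin/sh
# Sketch: report which llama.cpp source layout a checkout uses, to see
# whether a "Cannot find source file" error comes from the reorganization.
# The paths below are assumptions about the pre/post-reorg layout.
detect_layout() {
  dir="$1"  # path to the llama.cpp submodule checkout
  if [ -f "$dir/src/llama.cpp" ]; then
    echo "new layout: sources under src/"
  elif [ -f "$dir/llama.cpp" ]; then
    echo "old layout: sources at the repo root"
  else
    echo "no llama.cpp sources found"
  fi
}
```

If the checkout reports the new layout while the builder's CMakeLists.txt still lists root-level paths in `add_executable`, that mismatch would produce exactly this error; pinning the submodule to the commit the builder expects (rather than tracking the remote with `--remote`) is one way to sidestep it until the build scripts are updated.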

flatsiedatsie commented 3 months ago

Awesome! Thank you!