Closed: fire closed this issue 1 year ago
The information was in the whisper.cpp repo.
This WASM port uses WASM SIMD 128-bit intrinsics, so make sure your browser supports them.
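Before loading the SIMD build, you can check whether the current engine supports WASM SIMD at all. A minimal sketch, using the same trick as the wasm-feature-detect library: `WebAssembly.validate()` is given a tiny hand-encoded module containing a `v128` type and SIMD opcodes, and returns `true` only if the engine accepts them.

```javascript
// Detect WASM SIMD support by validating a tiny module that uses
// the v128 type and i8x16 SIMD instructions. validate() returns
// true only if the engine understands these opcodes.
const simdSupported = WebAssembly.validate(new Uint8Array([
  0, 97, 115, 109, 1, 0, 0, 0,                   // "\0asm" magic + version
  1, 5, 1, 96, 0, 1, 123,                        // type section: () -> v128
  3, 2, 1, 0,                                    // function section: 1 func of type 0
  10, 10, 1, 8, 0, 65, 0, 253, 15, 253, 98, 11,  // code: i32.const 0; i8x16.splat; i8x16.popcnt; end
]));

console.log(simdSupported
  ? 'WASM SIMD supported; the whisper.cpp WASM port should run here'
  : 'No WASM SIMD; this port will not work in this environment');
```

This works both in browsers and in Node, so the same check can gate which build variant gets loaded.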
- https://github.com/ggerganov/whisper.cpp/issues/585
- https://github.com/ggerganov/whisper.cpp/issues/103
- W3C WebNN
- Spec: https://w3.org/TR/webnn/
- Web: https://webmachinelearning.github.io/webnn-intro/
- WASI-NN
- Spec: https://github.com/WebAssembly/wasi-nn
- https://github.com/ggerganov/llama.cpp/issues/97
- https://github.com/ggerganov/whisper.cpp/issues/428 (neat demo idea)
```sh
make \
  CC=emcc \
  CXX=em++ \
  LLAMA_NO_ACCELERATE=1 \
  CFLAGS="\
    -DNDEBUG \
    -s MEMORY64" \
  CXXFLAGS="\
    -DNDEBUG \
    -s MEMORY64" \
  LDFLAGS="\
    -s MEMORY64 \
    -s FORCE_FILESYSTEM=1 \
    -s EXPORT_ES6=1 \
    -s MODULARIZE=1 \
    -s TOTAL_MEMORY=2GB \
    -s STACK_SIZE=524288 \
    -s ALLOW_MEMORY_GROWTH \
    -s EXPORTED_FUNCTIONS=_main \
    -s EXPORTED_RUNTIME_METHODS=callMain \
    -s BUILD_AS_WORKER=1 \
    -s SINGLE_FILE=1 \
    -s NO_EXIT_RUNTIME=1" \
  main.js
```
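For context on those flags: `MODULARIZE=1` plus `EXPORT_ES6=1` make emcc emit an ES module whose default export is a factory function returning a Promise for the runtime, and `EXPORTED_RUNTIME_METHODS=callMain` exposes `callMain` for passing CLI-style arguments. A rough sketch of how the resulting `main.js` might be consumed; the factory name, model path, and prompt are placeholders, and this ignores `BUILD_AS_WORKER=1` (which would normally mean loading the file via `new Worker` instead):

```javascript
// Hypothetical consumer of the main.js produced by the emcc flags above.
// MODULARIZE + EXPORT_ES6 make the output an ES module exporting a factory.
import createModule from './main.js';

const Module = await createModule({
  print: (text) => console.log(text),       // stdout from the wasm side
  printErr: (text) => console.error(text),  // stderr from the wasm side
});

// FORCE_FILESYSTEM=1 provides MEMFS; a model file would need to be written
// there first (e.g. via the FS APIs) before main can open it.
Module.callMain(['-m', 'models/ggml-model.bin', '-p', 'Hello from WASM']);
```

Since `NO_EXIT_RUNTIME=1` keeps the runtime alive after `main` returns, `callMain` can in principle be invoked more than once against the same instance.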
- [ ] ggml, llama.cpp, and whisper.cpp could be built with/by emscripten-forge? The existing emscripten-forge package definitions are in emscripten-forge/recipes/recipes/recipes_emscripten: https://github.com/emscripten-forge/recipes/tree/main/recipes/recipes_emscripten
Thanks for sharing. Since the issue has been closed, did you find a solution for building llama.cpp to WebAssembly?
It seems plausible that ggml could be compiled to WASM with WebGPU or WebGL2 support.
Has anyone done it and written about how to do it?
Alternatively, could someone share ideas on how to approach it?