vitoplantamura / OnnxStream

Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a RPI Zero 2 (or in 298MB of RAM) but also Mistral 7B on desktops and servers. ARM, x86, WASM, RISC-V supported. Accelerated by XNNPACK.
https://yolo.vitoplantamura.com/

(Not an issue) WASM Branch: How to build? #84

Closed JVGA8837 closed 2 months ago

JVGA8837 commented 2 months ago

I've noticed there's a new wasm branch in the repo. Is there a way to build and test it yet, or is it still too early in development?

vitoplantamura commented 2 months ago

hi @JVGA8837,

you can easily build it with Bazel; however, within a week I plan to finish development and merge that branch into master. I've developed a very nice demo with yolov8n, which I haven't committed yet, to demonstrate the use and performance of OnnxStream for WASM!

Vito

JVGA8837 commented 2 months ago

That sounds great, I'll be sure to check it out when it's finished!