vitoplantamura / OnnxStream

Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a RPI Zero 2 (or in 298MB of RAM) but also Mistral 7B on desktops and servers. ARM, x86, WASM, RISC-V supported. Accelerated by XNNPACK.
https://yolo.vitoplantamura.com/

Added "How to Convert and Run a Custom Stable Diffusion Model" #36

Closed: GaelicThunder closed this pull request 1 year ago

GaelicThunder commented 1 year ago

Added "How to Convert and Run a Custom Stable Diffusion Model" guide based on this issue

vitoplantamura commented 1 year ago

Hi,

I made a couple of fixes to the text that I will add in a later commit.

Thanks, Vito