
Altius

Small ONNX inference runtime written in Rust.
Feel free to create issues and discussions!

Requirements

The commands below assume the following tools are installed:

- Rust toolchain (cargo), for building the runtime and running the examples
- wasm-pack and yarn, for the WebAssembly demo
- uv, for the Python bindings
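
If Rust or uv is missing, the official one-line installers below should work (commands taken from rustup.rs and the uv documentation; verify against those sites before running):

# Install the Rust toolchain via rustup.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# Install uv.
curl -LsSf https://astral.sh/uv/install.sh | sh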

Run

# Download models.
(cd models && ./download.sh)
# Alternatively, download only the minimal set of models used in CI.
# (cd models && ./download.sh CI)

# Run examples.
# {mnist, mobilenet, deit, vit} are available.
# The number of threads used for computation is set in each example's
# source code; edit it there and rebuild (see the note below).
cargo run --release --example mnist
cargo run --release --example mobilenet
cargo run --release --example deit
cargo run --release --example vit

# Experimental CPU backend that generates C code.
cargo run --release --example mnist_cpu     -- --iters 10 
cargo run --release --example mobilenet_cpu -- --iters 10 --profile
cargo run --release --example deit_cpu      -- --iters 10 --threads 8 --profile
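
# Note: the interpreter examples above are ordinary Cargo examples (Rust
# source files under an examples/ directory), which is where their thread
# count is set. The *_cpu variants instead accept it on the command line
# via --threads, as shown above.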

Run from WebAssembly

Currently, MobileNetV3 runs in web browsers.

cd wasm
cargo install wasm-pack
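# Compile the crate to WebAssembly; wasm-pack writes the .wasm module and its JavaScript glue to pkg/.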
wasm-pack build --target web
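# Install the web app's JavaScript dependencies, then start a local dev server and open the URL it prints.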
yarn
yarn serve

Run from Python

cd ./crates/altius_py
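# Create the project virtual environment and install the locked Python dependencies.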
uv sync
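# Build the altius_py extension module in release mode (-r) and install it into the venv.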
uv run maturin develop -r
uv run python mobilenet.py