microsoft / onnxruntime-inference-examples

Examples for using ONNX Runtime for machine learning inferencing.
MIT License

[New Sample] Inference of Depth-Anything on Android #383

Open shubham0204 opened 5 months ago

shubham0204 commented 5 months ago

Depth-Anything is a recent advance in monocular depth estimation that leverages large unlabeled datasets through semi-supervised training, with DINOv2 as its semantic encoder.

The authors provide weights for three versions of the model, available on HuggingFace as PyTorch modules. The repository fabio-sim/Depth-Anything-ONNX has scripts that convert the model weights to ONNX models. Using these ONNX models as a base, I've fused the pre/postprocessing operations into a single ONNX model that takes an RGB image as input and outputs a depth map.
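For reference, the pre/postprocessing that gets fused into the model roughly follows the usual Depth-Anything pipeline: ImageNet mean/std normalization and NCHW layout on the way in, min-max normalization of the raw depth map on the way out. A minimal NumPy sketch, where the exact constants and function names are my assumptions, not taken from the repository:

```python
import numpy as np

# Assumed constants: standard ImageNet normalization, as typically used
# with DINov2-based encoders. Verify against the actual export script.
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(rgb: np.ndarray) -> np.ndarray:
    """HWC uint8 RGB image -> NCHW float32 tensor in ONNX input layout."""
    x = rgb.astype(np.float32) / 255.0
    x = (x - IMAGENET_MEAN) / IMAGENET_STD
    return x.transpose(2, 0, 1)[np.newaxis, ...]  # shape (1, 3, H, W)

def postprocess(depth: np.ndarray) -> np.ndarray:
    """Raw model output -> uint8 depth map via min-max normalization."""
    d = depth.squeeze().astype(np.float32)
    d = (d - d.min()) / (d.max() - d.min() + 1e-8)  # scale to [0, 1]
    return (d * 255.0).astype(np.uint8)
```

Fusing these steps into the graph means the Android app can feed raw pixels directly and read back a displayable depth map, with no tensor math on the Kotlin side.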

This model is used for inference in an Android app built with onnxruntime's Android libraries; the app then applies the Inferno colormap to the depth-map output. Here's the Android app: shubham0204/Depth-Anything-Android
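Applying a colormap amounts to indexing a 256-entry RGB lookup table with the uint8 depth values. A small sketch of the idea: the real Inferno table ships with matplotlib, so the anchor colors below are only rough approximations for illustration:

```python
import numpy as np

# Approximate Inferno-like anchor colors (dark purple -> orange -> pale
# yellow). These are stand-in values, not the actual Inferno table.
ANCHORS = np.array([
    [0, 0, 4],        # near-black
    [120, 28, 109],   # purple
    [237, 105, 37],   # orange
    [252, 255, 164],  # pale yellow
], dtype=np.float32)

def build_lut(anchors: np.ndarray, size: int = 256) -> np.ndarray:
    """Linearly interpolate anchor colors into a (size, 3) uint8 LUT."""
    xs = np.linspace(0.0, 1.0, len(anchors))
    xi = np.linspace(0.0, 1.0, size)
    lut = np.stack([np.interp(xi, xs, anchors[:, c]) for c in range(3)], axis=1)
    return lut.astype(np.uint8)

def colorize(depth_u8: np.ndarray) -> np.ndarray:
    """uint8 depth map (H, W) -> RGB image (H, W, 3) via LUT indexing."""
    return build_lut(ANCHORS)[depth_u8]
```

The same table-lookup approach works on the Android side, where the app maps each depth value to a packed ARGB pixel.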


The Android app showcases many capabilities of ONNX and onnxruntime. Including this sample in onnxruntime-inference-examples would be of great help to app developers.

TouqeerAhmad commented 3 months ago

@shubham0204 great work! Have you explored any form of INT8 quantization for Depth-Anything?
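For context on the question: onnxruntime ships a quantization toolkit, and dynamic weight-only INT8 quantization is the simplest starting point since it needs no calibration data. A hedged sketch using `onnxruntime.quantization.quantize_dynamic` (the file paths are placeholders, and results for a ViT-style encoder like DINOv2 would need to be validated for accuracy):

```python
def quantize_depth_anything(model_in: str, model_out: str) -> None:
    """Quantize an ONNX model's weights to INT8 using dynamic quantization.

    Imported lazily so this sketch can be loaded without onnxruntime
    installed; paths such as "depth_anything.onnx" are placeholders.
    """
    from onnxruntime.quantization import quantize_dynamic, QuantType

    # quantize_dynamic stores MatMul/Gemm weights as 8-bit integers and
    # dequantizes them at runtime; activations stay in float.
    quantize_dynamic(model_in, model_out, weight_type=QuantType.QUInt8)
```

On mobile this mainly shrinks the model file and memory footprint; latency gains depend on the execution provider, so benchmarking on-device is advisable.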