LiheYoung / Depth-Anything

[CVPR 2024] Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data. Foundation Model for Monocular Depth Estimation
https://depth-anything.github.io
Apache License 2.0

DepthAnything example for the web and without any servers... #228

Closed. akbartus closed this issue 1 month ago.

akbartus commented 2 months ago

Hi!

I just wanted to take a moment to thank you for the fantastic repository and model. I’ve been experimenting with DepthAnything and have developed a browser-based implementation of it, powered by ONNX.

You can check it out here:

GitHub Repo: https://github.com/akbartus/DepthAnything-on-Browser
Interactive Demo: https://depthanything.glitch.me/

Feel free to share it on your page if you think it might be of interest!
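For anyone curious, the core of browser-side inference looks roughly like the sketch below using onnxruntime-web. The model file name, the 518x518 input size, and the ImageNet normalization constants are illustrative assumptions here, not necessarily what the linked repo uses:

```ts
// Minimal sketch: run a Depth Anything ONNX export in the browser with onnxruntime-web.
import * as ort from "onnxruntime-web";

const SIZE = 518; // assumed network input resolution
const MEAN = [0.485, 0.456, 0.406]; // assumed ImageNet normalization
const STD = [0.229, 0.224, 0.225];

async function estimateDepth(img: HTMLImageElement): Promise<ort.Tensor> {
  // Draw the image into an offscreen canvas at the network resolution.
  const canvas = document.createElement("canvas");
  canvas.width = SIZE;
  canvas.height = SIZE;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(img, 0, 0, SIZE, SIZE);
  const { data } = ctx.getImageData(0, 0, SIZE, SIZE); // RGBA, HWC, uint8

  // Convert to CHW float32 and normalize.
  const chw = new Float32Array(3 * SIZE * SIZE);
  for (let i = 0; i < SIZE * SIZE; i++) {
    for (let c = 0; c < 3; c++) {
      chw[c * SIZE * SIZE + i] = (data[i * 4 + c] / 255 - MEAN[c]) / STD[c];
    }
  }

  // Load the model (hypothetical file name) and run inference.
  const session = await ort.InferenceSession.create("depth_anything_small.onnx");
  const input = new ort.Tensor("float32", chw, [1, 3, SIZE, SIZE]);
  const outputs = await session.run({ [session.inputNames[0]]: input });
  return outputs[session.outputNames[0]]; // relative depth map
}
```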

LiheYoung commented 2 months ago

Hi @akbartus, thank you very much for your contribution! I will add this to our README in the next update.

akbartus commented 2 months ago

Thank you. Feel free to close the issue.

NikolasE commented 2 months ago

That's a nice tool! Which metric model are you running? It would also be nice to be able to pass the focal length as a parameter instead of using a fixed fov of 45.

akbartus commented 2 months ago

> That's a nice tool! Which metric model are you running? It would also be nice to be able to pass the focal length as a parameter instead of using a fixed fov of 45.

@NikolasE thanks for the feedback. The example uses the small quantized model. I will update the example to include interactive sliders for FOV and other parameters.
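For context, the usual way to turn a FOV slider (or a user-supplied focal length) into the intrinsics needed to unproject the depth map is the pinhole relation f = 0.5 * W / tan(FOV / 2). The sketch below is an illustration of that conversion, not the demo's actual code:

```ts
// Sketch: derive focal length in pixels from a user-supplied horizontal FOV
// (or accept f directly), then back-project a depth value at pixel (u, v).
function focalFromFov(imageWidth: number, fovDeg: number): number {
  // f = 0.5 * W / tan(FOV / 2), with FOV converted to radians
  return (0.5 * imageWidth) / Math.tan((fovDeg * Math.PI) / 360);
}

function unproject(
  u: number, v: number, depth: number,
  f: number, cx: number, cy: number
): [number, number, number] {
  // Standard pinhole back-projection: x = (u - cx) * z / f, y = (v - cy) * z / f
  const x = ((u - cx) * depth) / f;
  const y = ((v - cy) * depth) / f;
  return [x, y, depth];
}

// Example: a slider or URL parameter could drive fovDeg instead of hard-coding 45.
const f = focalFromFov(640, 45);
const point = unproject(320, 240, 2.0, f, 320, 240);
```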

akbartus commented 1 month ago

@NikolasE I created an example with sliders; you can check it here: https://depthanything.glitch.me/webgpu-sliders.html (it already uses Depth Anything V2). I updated the repo accordingly.