-
Hello,
I have recently downloaded the Depth-Anything-V2-Small model from the provided link on GitHub. According to the documentation, the model size is listed as 24.8M. However, upon downloading, I…
-
Hi, thank you for your great work.
I would like to know whether the evaluation metrics of Depth Anything, such as AbsRel, are based on relative depth or metric depth.
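For context, AbsRel (absolute relative error) is conventionally defined as the mean of |pred - gt| / gt over valid pixels; for relative-depth models it is typically computed after aligning predictions to ground truth with a per-image scale and shift. A minimal sketch in NumPy:
```Python
import numpy as np

def abs_rel(pred: np.ndarray, gt: np.ndarray) -> float:
    """Absolute relative error: mean(|pred - gt| / gt) over valid pixels."""
    valid = gt > 0  # skip pixels without ground truth
    return float(np.mean(np.abs(pred[valid] - gt[valid]) / gt[valid]))
```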
-
Hello, I'm seeing a discrepancy between the PyTorch and ONNX Depth Anything V2 Small models.
Does anyone have any solution for this issue?
Thanks
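One way to narrow this down is to feed both backends the identical preprocessed tensor and compare the raw outputs numerically. A minimal sketch, assuming `model` is the already-loaded PyTorch network in eval mode and `depth_anything_v2_small.onnx` is a hypothetical export path:
```Python
import numpy as np
import onnxruntime as ort
import torch

# Identical preprocessed input for both backends (1x3x518x518 assumed here).
x = np.random.rand(1, 3, 518, 518).astype(np.float32)

# PyTorch forward pass (model assumed loaded and in eval mode).
with torch.no_grad():
    torch_out = model(torch.from_numpy(x)).cpu().numpy()

# ONNX Runtime forward pass on the same input.
sess = ort.InferenceSession("depth_anything_v2_small.onnx")
input_name = sess.get_inputs()[0].name
onnx_out = sess.run(None, {input_name: x})[0]

# Large max-abs differences usually point at preprocessing or export issues;
# tiny ones are just fp32 numerical noise.
print("max abs diff:", np.abs(torch_out - onnx_out).max())
```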
-
Hello! This work is very impressive and shows great potential. May I ask when the Depth-Anything-V2-Giant 1.3B model will be released? I am very eager for its release!
-
What steps would you recommend for using the small Depth Anything relative depth estimation model with the metric depth estimation pipeline? Will it need to be re-trained or should I be able t…
-
Is there a way I could use the metric Depth Anything model for NYU with the Hugging Face pipeline, like the other Depth Anything models?
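If the metric checkpoints are published on the Hub in transformers format, they should load through the same `depth-estimation` pipeline as the relative models. A minimal sketch, where the model ID below is an assumption about the repo name (check the Hub for the exact one):
```Python
from transformers import pipeline
from PIL import Image

# Hypothetical repo name for the NYU/indoor metric checkpoint.
pipe = pipeline(
    task="depth-estimation",
    model="depth-anything/Depth-Anything-V2-Metric-Indoor-Small-hf",
)

image = Image.open("example.jpg")
result = pipe(image)
depth = result["depth"]  # PIL image; result["predicted_depth"] is the raw tensor
```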
-
In the prediction tab, the input specification says: "Input image whose depth will be estimated. The shorter dimension should be 518 pixels and the larger dimension should be a multiple of 14", but …
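A minimal sketch of a resize that satisfies that constraint (shorter side fixed at 518, longer side scaled proportionally and rounded to a multiple of 14; the helper name is hypothetical):
```Python
from PIL import Image

def resize_for_model(image: Image.Image, short_side: int = 518, multiple: int = 14) -> Image.Image:
    """Resize so the shorter side is `short_side` and the longer side is a multiple of `multiple`."""
    w, h = image.size
    scale = short_side / min(w, h)
    if w < h:
        new_w = short_side
        new_h = round(h * scale / multiple) * multiple
    else:
        new_h = short_side
        new_w = round(w * scale / multiple) * multiple
    return image.resize((new_w, new_h), Image.BICUBIC)
```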
-
This would be super useful to me :)
Paper:
[https://arxiv.org/pdf/2406.09414](https://arxiv.org/pdf/2406.09414)
Models:
https://huggingface.co/spaces/depth-anything/Depth-Anything-V2/blob/main…
-
I am trying to understand the code. For the depth prediction part, scaling and shifting are applied when using Depth Anything v1 and MiDaS. For Depth Anything v1, the constants are
```Python
scale = 0.030…
```
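For context, scale/shift constants like these come from affinely aligning a model's relative predictions to metric ground truth. A minimal sketch of that least-squares fit (the standard scale-and-shift alignment from the MiDaS evaluation protocol, not the repo's exact code):
```Python
import numpy as np

def fit_scale_shift(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """Least-squares fit of (scale, shift) so that scale * pred + shift ~= gt."""
    valid = gt > 0
    A = np.stack([pred[valid], np.ones(valid.sum())], axis=1)
    (scale, shift), *_ = np.linalg.lstsq(A, gt[valid], rcond=None)
    return float(scale), float(shift)

# Aligned metric-like depth from a relative prediction:
# scale, shift = fit_scale_shift(pred, gt)
# metric_depth = scale * pred + shift
```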
-
**Details of model being requested**
- Model name: Depth-Anything
- Source repo link: https://github.com/LiheYoung/Depth-Anything
- Research paper link [If applicable]: https://arxiv.org/abs/2401…