-
I want to train your model on my own dataset, which has RGBA images plus relative depth maps (8-bit).
Is this possible?
Here is an example of my data:
RGB Image:
![shot_0003_source_0000](https…
-
Hello, thank you very much for the great work. I have already tested the new depth map feature on several datasets based on the depth maps from depth anything v2. However, significantly more floaters …
-
Hello, great work!
Can you please clarify which DepthAnything and DepthAnything-V2 models are used for comparison in Table 1 of the paper?
Also, there is no detail on the inference speed of the mod…
-
Hi everyone, thanks once again for the work.
So far I've been able to train a NeRF with Instant NGP, also using depth supervision, and obtain fairly good results.
When I prepare the depth maps fo…
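As context for preparing depth maps for depth-supervised instant-ngp: the training code typically reads a 16-bit depth image per frame together with a scale factor that converts the integer values back into scene units (the exact field names in transforms.json are an assumption here, not confirmed by this thread). A minimal quantization sketch:

```python
import numpy as np

def quantize_depth_16bit(depth, max_depth):
    """Quantize a float depth map into uint16.

    The original depth can be recovered as raw * scale, where
    scale = max_depth / 65535 is the per-dataset integer depth scale.
    """
    raw = np.round(np.clip(depth, 0.0, max_depth) / max_depth * 65535).astype(np.uint16)
    scale = max_depth / 65535.0
    return raw, scale
```

The saved uint16 image and the returned scale would then be referenced from the dataset's transforms file; `max_depth` is a hypothetical per-scene bound you would choose yourself.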
-
I used your model on several videos (more precisely, on several sequences of images), and the quality and resolution are very impressive. The temporal consistency is usually also good, but not in …
-
Dear Haoyu Ma, thank you very much for the work; I really like your paper.
I have one question regarding your dataset, though: in your dataset, you only have defocus maps and not depth maps. My ques…
-
Hi
It is unclear to me how you calculate the $\delta_1$ and $AbsRel$ metrics in the zero-shot relative depth setting.
Are you using the normalized disparity maps $d$, the affine-invariant disparity…
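For reference, a common convention in zero-shot relative depth evaluation is to first align the prediction to the ground truth with a least-squares scale and shift (the affine-invariant setting) and only then compute AbsRel and $\delta_1$ on valid pixels. A sketch of that pipeline, assuming NumPy arrays and a validity mask (this is the standard recipe, not necessarily the exact one used in the paper):

```python
import numpy as np

def align_scale_shift(pred, gt, mask):
    """Least-squares fit of s, t so that s * pred + t ≈ gt on valid pixels."""
    p, g = pred[mask], gt[mask]
    A = np.stack([p, np.ones_like(p)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, g, rcond=None)
    return s * pred + t

def abs_rel(pred, gt, mask):
    """Mean absolute relative error over valid pixels."""
    return np.mean(np.abs(pred[mask] - gt[mask]) / gt[mask])

def delta1(pred, gt, mask):
    """Fraction of valid pixels with max(pred/gt, gt/pred) < 1.25."""
    ratio = np.maximum(pred[mask] / gt[mask], gt[mask] / pred[mask])
    return np.mean(ratio < 1.25)
```

Whether the alignment is done in depth space or disparity space changes the numbers, which is exactly why the question above matters.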
-
**Describe the bug**
Visualization tool outputs wrong depth for `create_depth_maps_from_gempy()`.
I believe the following line of code is to blame:
1690: # A…
-
Hi,
In the dist_train.sh file used for training, what min_depth and max_depth values should I use? I'm training the model on my own custom dataset, which has 8-bit relative depth maps.
The default valu…
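One common way to bridge this gap, purely as an illustration: map the 8-bit relative values linearly into the [min_depth, max_depth] range that the training config expects. The bounds below are hypothetical placeholders, and this linear mapping is an assumption about the loader, not necessarily what dist_train.sh does:

```python
import numpy as np

# Hypothetical bounds — substitute the values from your dist_train.sh.
MIN_DEPTH, MAX_DEPTH = 0.1, 10.0

def depth_from_8bit(raw):
    """Linearly map an 8-bit relative depth map into [MIN_DEPTH, MAX_DEPTH]."""
    rel = raw.astype(np.float32) / 255.0
    return MIN_DEPTH + rel * (MAX_DEPTH - MIN_DEPTH)
```

Note that if the 8-bit maps actually encode disparity rather than depth, the mapping would need an inversion first.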
-
So I realized ReShade's OpenGL hooks work with angrylion, and I have been having fun playing around with shaders (scanlines, anyone?). However, it seems like a lot of the shaders ReShade offers don't work becau…