Nazaninpdk opened this issue 1 month ago
I am also projecting images to 3D and having problems with spatial alignment, more specifically between camera views. I suspect it is because the model only accepts a horizontal focal length, but I may be wrong.
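For reference, this is the standard pinhole back-projection I have in mind, with separate fx and fy; the function name and array layout are only illustrative and not part of the Depth Pro API:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a metric depth map of shape (H, W) into camera-space XYZ points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # shape (H, W, 3)
```

If fy differs noticeably from fx and only one of them is used, the reconstructed points get stretched along one axis, which would show up exactly as a misalignment between views.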
I have the same issue. In red is the predicted depth and in blue the actual depth from the sensor.
@Nazaninpdk Quick question: if you re-scale the point cloud so the 57 cm becomes 29 cm, do the distances between the other objects in your scene match the real-world distances as well?
Would you mind sharing your point cloud so I can do some measurements?
I'm interested to know whether the Depth Pro estimate, although not at real-world scale, is at least consistent between the elements in the scene. If the distances between objects are correct after scaling the point cloud to real-world scale, then the Depth Pro scale could be "fixed" using a real depth map from the iPhone's TrueDepth/LiDAR sensor.
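A minimal sketch of what I mean by "fixing" the scale, assuming a sparse sensor depth map already aligned pixel-wise with the prediction (names are placeholders, not an existing API):

```python
import numpy as np

def rescale_depth(pred_depth, sensor_depth):
    """Scale predicted depth so it agrees with valid sensor depth in a median sense."""
    mask = (sensor_depth > 0) & np.isfinite(pred_depth)
    scale = np.median(sensor_depth[mask] / pred_depth[mask])
    return pred_depth * scale, scale
```

If the relative geometry is correct, a single scalar like this should bring object sizes to real-world scale; if it doesn't, the error is not just a global scale.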
Hello. Thank you for your great work. I am investigating monocular depth estimation and I need to use it for size measurements. I used your code to estimate the depth. Visually the colored depth map is perfect and all the details are visible. But when I project the RGB image + depth with Open3D (using my own fx, fy, cx, cy), the results are sometimes not as good as the estimated depth map suggests. In particular, when I measure the size of objects in the point cloud, the measurements are not correct. My Open3D projection is sketched after the images below.
My RGB image:
Estimated Depth using Depth Pro:
The point cloud:
Measured size: 57 cm, while it should be 29 cm.
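For completeness, this is roughly how I build the point cloud with Open3D; the file names and intrinsic values are placeholders, and the depth map is assumed to already be metric (in metres):

```python
import numpy as np
import open3d as o3d

# Placeholder intrinsics; in practice I use my own calibrated fx, fy, cx, cy.
w, h = 1920, 1440
fx, fy, cx, cy = 1500.0, 1500.0, w / 2.0, h / 2.0

color = o3d.io.read_image("rgb.png")
depth_m = np.load("depth_pro_output.npy").astype(np.float32)  # metric depth in metres
depth = o3d.geometry.Image(depth_m)

# depth_scale=1.0 because the depth map is already in metres.
rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
    color, depth, depth_scale=1.0, depth_trunc=10.0, convert_rgb_to_intensity=False)

intrinsic = o3d.camera.PinholeCameraIntrinsic(w, h, fx, fy, cx, cy)
pcd = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsic)
o3d.visualization.draw_geometries([pcd])
```

I then measure object sizes directly on the resulting point cloud, which is where I get 57 cm instead of the expected 29 cm.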