immersive-web / depth-sensing

Specification: https://immersive-web.github.io/depth-sensing/
Explainer: https://github.com/immersive-web/depth-sensing/blob/main/explainer.md

How to estimate focal length of an Android phone ToF camera? #36

Open akssieg opened 2 years ago

akssieg commented 2 years ago

I need to estimate the focal length of the ToF camera in order to compute surface normals from the depth data. I was going through the documentation of the "WebXR Depth Sensing Module" but couldn't find any info regarding the ToF camera's intrinsics or field of view.

Any comments and suggestions will be appreciated!!!

AdaRoseCannon commented 2 years ago

If you only need the normal at a single point, the hit-test API will give it to you as the orientation of the hit result's pose.
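
For reference, a minimal sketch of reading that orientation out as a normal vector (hitTestSource and referenceSpace are assumed to be set up already; the convention that the hit pose's +Y axis points along the surface normal is an assumption worth verifying against the hit-test spec):

```js
// Sketch: extract the surface normal from a hit-test result by rotating
// the unit +Y vector by the hit pose's orientation quaternion.
const results = frame.getHitTestResults(hitTestSource);
if (results.length > 0) {
  const pose = results[0].getPose(referenceSpace);
  if (pose) {
    const q = pose.transform.orientation; // DOMPointReadOnly quaternion
    // Middle column of the rotation matrix derived from q, i.e. R * (0, 1, 0):
    const normal = {
      x: 2 * (q.x * q.y - q.w * q.z),
      y: 1 - 2 * (q.x * q.x + q.z * q.z),
      z: 2 * (q.y * q.z + q.w * q.x),
    };
  }
}
```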

akssieg commented 2 years ago

@AdaRoseCannon Initially I tried the hit-test API, but it performs very poorly on vertical walls and ceilings. I think it first estimates a plane and then computes the hit point on that plane. Instead, I want to estimate the normal at a point on a surface from the surface gradients, and for that I need either the field of view or the focal length.
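
For context, the gradient-based computation being described looks roughly like the sketch below. Here depthAt, fx, and fy are hypothetical: depthAt(u, v) returns depth in meters at integer depth-buffer coordinates, and fx/fy are the focal lengths in pixels that this issue is asking how to obtain.

```js
// Sketch: surface normal from depth gradients via central differences.
// Back-projecting pixel (u, v) with depth z gives the 3D point
// ((u - cx) * z / fx, (v - cy) * z / fy, z); crossing the two tangent
// vectors yields a normal proportional to (-fx * dz/du, -fy * dz/dv, z).
function normalAt(depthAt, u, v, fx, fy) {
  const dzdu = (depthAt(u + 1, v) - depthAt(u - 1, v)) / 2;
  const dzdv = (depthAt(u, v + 1) - depthAt(u, v - 1)) / 2;
  const z = depthAt(u, v);
  const n = [-fx * dzdu, -fy * dzdv, z];
  const len = Math.hypot(n[0], n[1], n[2]);
  return [n[0] / len, n[1] / len, n[2] / len];
}
```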

akssieg commented 2 years ago

Is Depthinfo.normDepthBufferFromNormView a projection matrix? Can I use it to find the field of view, focal length, etc.?

bialpio commented 2 years ago

> Is Depthinfo.normDepthBufferFromNormView a projection matrix? Can I use it to find the field of view, focal length, etc.?

You should be able to find the projection matrix in XRView; please see XRView.projectionMatrix. We also have a sample computation of camera intrinsics (including focal length) with a hopefully fairly detailed derivation here.
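
As a sketch of that computation (assuming the column-major Float32Array exposed by XRView.projectionMatrix and an XRViewport; getCameraIntrinsics is a hypothetical helper name, and this is a reconstruction of the idea rather than the linked sample verbatim):

```js
// Sketch: recover pinhole-camera intrinsics from a WebGL-style
// column-major projection matrix and the viewport it renders into.
function getCameraIntrinsics(projectionMatrix, viewport) {
  const p = projectionMatrix;
  // Principal point in pixels (usually near the center of the viewport):
  const cx = (1 - p[8]) * viewport.width / 2 + viewport.x;
  const cy = (1 - p[9]) * viewport.height / 2 + viewport.y;
  // Focal lengths in pixels (equal for square pixels):
  const fx = (viewport.width / 2) * p[0];
  const fy = (viewport.height / 2) * p[5];
  // Skew in pixels (nonzero only for non-rectangular pixels):
  const skew = (viewport.width / 2) * p[4];
  return { fx, fy, cx, cy, skew };
}
```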

akssieg commented 2 years ago

@bialpio I am actually a little confused here. Is that projection matrix for the RGB camera or the ToF camera?

klausw commented 2 years ago

@bialpio wrote:

> You should be able to find the projection matrix in XRView; please see XRView.projectionMatrix. We also have a sample computation of camera intrinsics (including focal length) with a hopefully fairly detailed derivation here.

The link for the intrinsics calculation didn't work right; try this one instead.

@akssieg wrote:

> @bialpio I am actually a little confused here. Is that projection matrix for the RGB camera or the ToF camera?

If I'm understanding it right, the depth buffer is associated with a specific XRView, and it should be safe to assume that its view geometry matches that XRView. If it's associated with the RGB camera's XRView, the XR device needs to ensure that the views are cropped to a single effective view even if the underlying cameras' fields of view differ. In that case, intrinsics calculated for the RGB camera view would also apply to the depth camera view, after using normDepthBufferFromNormView for the coordinate transforms (see the sketch after this comment).

I think this implies that there may be a minor mismatch in extrinsics if the RGB camera and depth camera aren't in exactly the same location on the device. If that turns out to be a significant difference, it would be necessary to use separate XRViews for the two cameras, but I'm not sure that's a good fit for the current WebXR APIs.
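
To make the normDepthBufferFromNormView step above concrete, a minimal sketch of the coordinate transform (assuming the column-major matrix exposed on XRRigidTransform; normViewToNormDepth is a hypothetical helper name):

```js
// Sketch: map normalized view coordinates (as used for the RGB view)
// into normalized depth-buffer coordinates.
function normViewToNormDepth(depthInfo, x, y) {
  const m = depthInfo.normDepthBufferFromNormView.matrix; // column-major 4x4
  return {
    x: m[0] * x + m[4] * y + m[12],
    y: m[1] * x + m[5] * y + m[13],
  };
}
```

Note that XRCPUDepthInformation.getDepthInMeters() already takes normalized view coordinates and applies this transform internally; doing it by hand is mainly useful when sampling the raw depth buffer yourself, e.g. in a shader when using GPU-optimized depth.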