hoonkai opened this issue 5 years ago (status: Open)
The IR "sensor" is actually two sensors that pick up a dot pattern projected by the IR projector. You can find the distance between the two IR sensors in the documentation, from a ROS topic, or by querying the device in code (see the sketch below).
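For the "query it in code" route, here is a minimal sketch using pyrealsense2, assuming a connected D400-series camera; the stream settings are illustrative and the extrinsic translation is reported in meters:

```python
# Sketch: read the stereo (IR-to-IR) baseline from the device itself.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.infrared, 1, 640, 480, rs.format.y8, 30)  # left imager
config.enable_stream(rs.stream.infrared, 2, 640, 480, rs.format.y8, 30)  # right imager
profile = pipeline.start(config)

ir1 = profile.get_stream(rs.stream.infrared, 1).as_video_stream_profile()
ir2 = profile.get_stream(rs.stream.infrared, 2).as_video_stream_profile()

# Extrinsics between the two imagers; translation is in meters.
extrinsics = ir1.get_extrinsics_to(ir2)
baseline_m = abs(extrinsics.translation[0])
print("Stereo baseline: %.4f m" % baseline_m)

pipeline.stop()
```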
Generally you would want to align the two images, the one from the depth camera and the one from the RGB camera, so that after aligning the frames you only need the `fx`, `fy`, `cx`, `cy` of the unchanged camera stream. The depth stream has already combined the two IR sensor images into a depth map of the scene.
After aligning the streams using the RealSense library, you can query it for the camera intrinsics and get the correct parameters.
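As a minimal sketch of that flow with pyrealsense2 (the resolutions and frame rates are illustrative, not required values):

```python
# Sketch: align depth to color, then query the color stream's intrinsics.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
profile = pipeline.start(config)

# Map depth pixels into the color camera's frame of reference.
align = rs.align(rs.stream.color)
frames = align.process(pipeline.wait_for_frames())

# After alignment, the color stream's intrinsics describe both images.
intr = profile.get_stream(rs.stream.color).as_video_stream_profile().get_intrinsics()
print("fx=%.2f fy=%.2f cx=%.2f cy=%.2f" % (intr.fx, intr.fy, intr.ppx, intr.ppy))

pipeline.stop()
```

Note that librealsense calls the principal point `ppx`/`ppy`; those are the `cx`/`cy` values an ORB-SLAM2 settings file expects.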
@DimTrigkakis How would you ask the RealSense library for the camera intrinsics? I'm using the RealSense D435.
@collinabidi If this is still an issue, you can do it through the command line with `Intel.Realsense.CustomRW -sn ##-SerialNumber## -r -f ##FileName##`. This is from the "Intel® RealSense™ Product Family D400 Series Calibration Tools" document, Chapter 6.
Issue Description:
Clarification Needed:
Hi,

I'm trying to find the value of `Camera.bf` for my RGBD camera. I believe the value is meant to be "IR projector baseline times fx (approx.)", but how is the IR projector baseline measured? Is it the distance between the IR projector and the IR camera, or between the RGB camera and the IR camera? Also, I presume `fx` here is the `fx` of the RGB camera, but don't the `fx`, `fy`, `cx` and `cy` of the IR camera need to be taken into account somehow? Any help will be much appreciated.
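Putting the comments above together, a hedged sketch of how `Camera.bf` could be derived on a D400-series device, assuming depth has been aligned to the color stream so that `fx` is the color camera's focal length (all stream settings here are illustrative):

```python
# Sketch: derive ORB-SLAM2's Camera.bf = stereo baseline (m) * fx (px),
# with depth aligned to color so fx comes from the color stream.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.infrared, 1, 640, 480, rs.format.y8, 30)
config.enable_stream(rs.stream.infrared, 2, 640, 480, rs.format.y8, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
profile = pipeline.start(config)

# Baseline between the two IR imagers (the pair that produces depth).
ir1 = profile.get_stream(rs.stream.infrared, 1).as_video_stream_profile()
ir2 = profile.get_stream(rs.stream.infrared, 2).as_video_stream_profile()
baseline_m = abs(ir1.get_extrinsics_to(ir2).translation[0])

# fx of the stream the depth map is aligned to (color here).
fx = profile.get_stream(rs.stream.color).as_video_stream_profile().get_intrinsics().fx

print("Camera.bf = %.3f" % (baseline_m * fx))

pipeline.stop()
```

On a D435 the reported baseline is roughly 0.05 m, so with a typical 640x480 color `fx` of about 615 px this works out to `Camera.bf` ≈ 0.05 × 615 ≈ 30.75; your device's calibrated values may differ slightly.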