Parallel Tracking and Mapping (PTAM) is a camera tracking system for Augmented Reality. It requires no markers, pre-made maps, known templates, or inertial sensors.
I was able to compute the intrinsic calibration matrix K.
There are a few takeaways:
The Camera2 API provides the intrinsic parameters f_x, f_y, c_x, c_y, and s via characteristics.get(CameraCharacteristics.LENS_INTRINSIC_CALIBRATION). The drawback of this method is that not all device cameras support it.
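As a sketch, the 5-element array that LENS_INTRINSIC_CALIBRATION reports, ordered [f_x, f_y, c_x, c_y, s], can be mapped into a 3x3 matrix K. The helper below is a plain-Kotlin illustration (the function name and matrix representation are my own choices); the Camera2 call itself is shown only in a comment, since it may return null on unsupported devices.

```kotlin
// On Android the raw values would come from (may be null on unsupported devices):
//   val params = characteristics.get(CameraCharacteristics.LENS_INTRINSIC_CALIBRATION)
// Here we just map that 5-element array [f_x, f_y, c_x, c_y, s] into K:
//   | f_x  s    c_x |
//   | 0    f_y  c_y |
//   | 0    0    1   |
fun intrinsicsToK(params: FloatArray): Array<FloatArray> {
    require(params.size == 5) { "Expected [f_x, f_y, c_x, c_y, s]" }
    val (fx, fy, cx, cy, s) = params
    return arrayOf(
        floatArrayOf(fx, s, cx),
        floatArrayOf(0f, fy, cy),
        floatArrayOf(0f, 0f, 1f)
    )
}
```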
There's a fallback: approximate the calibration matrix K from the focal length returned by characteristics.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS) and the image width and height. We assume the focal lengths in x and y are equal (f_x = f_y = focalLength), the principal point is at the image center (c_x = width / 2, c_y = height / 2), and the camera has no distortion.
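The fallback above can be sketched as a small helper. One caveat worth flagging: LENS_INFO_AVAILABLE_FOCAL_LENGTHS reports focal length in millimetres, so a conversion to pixels (using the sensor's physical width) would be needed first; the function below assumes that conversion has already been done. Names here are illustrative, not from the Camera2 API.

```kotlin
// Fallback approximation of K when LENS_INTRINSIC_CALIBRATION is unavailable.
// Assumptions (matching the text): f_x = f_y, principal point at the image
// center, zero skew, no lens distortion.
// focalLengthPx is assumed to already be in pixels; Camera2's
// LENS_INFO_AVAILABLE_FOCAL_LENGTHS reports millimetres, so a conversion
// such as focalMm * width / sensorWidthMm would be applied beforehand.
fun approximateK(focalLengthPx: Float, width: Int, height: Int): Array<FloatArray> =
    arrayOf(
        floatArrayOf(focalLengthPx, 0f, width / 2f),
        floatArrayOf(0f, focalLengthPx, height / 2f),
        floatArrayOf(0f, 0f, 1f)
    )
```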