IntelRealSense / librealsense

Intel® RealSense™ SDK
https://www.intelrealsense.com/
Apache License 2.0

Align T265 pose with compass #3805

Closed eospi closed 5 years ago

eospi commented 5 years ago

| Required Info | |
|---|---|
| Camera Model | { R200 / F200 / SR300 / ZR300 / D400 } |
| Firmware Version | (Open RealSense Viewer --> Click info) |
| Operating System & Version | { Win (8.1/10) / Linux (Ubuntu 14/16/17) / MacOS } |
| Kernel Version (Linux Only) | (e.g. 4.14.13) |
| Platform | PC / Raspberry Pi / NVIDIA Jetson / etc. |
| SDK Version | { legacy / 2.<?>.<?> } |
| Language | { C / C# / labview / nodejs / opencv / pcl / python / unity } |
| Segment | { Robot / Smartphone / VR / AR / others } |

Issue Description

Does the SDK have support for aligning the T265 with a compass such that 0 degrees is north? If not, what is the best way to implement this? Thanks!

ev-mp commented 5 years ago

@eospi, the T265 is an inside-out tracking device that relies solely on self-obtained inertial and visual feeds and cannot receive external coordinates directly. Note that the device does track the level (horizon plane) by resolving the G-force vector, so even though the position and heading angle are arbitrarily set to [0,0,0] and [0] at device start-up, the real pitch and roll angles are always correct. The actual precision is bounded by the IMU calibration tolerance.
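For reference, a minimal pyrealsense2 sketch that reads the pose stream and derives the gravity-referenced pitch and roll from the reported quaternion. The Euler conversion below assumes the T265 axis convention (X right, Y up, Z backwards), so double-check it against your setup:

```python
import math
import pyrealsense2 as rs

# Start streaming pose data from the T265.
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)

try:
    frames = pipe.wait_for_frames()
    pose = frames.get_pose_frame()
    if pose:
        q = pose.get_pose_data().rotation  # quaternion (x, y, z, w)
        # Pitch and roll are absolute (gravity-referenced); yaw is relative to whatever
        # heading the device happened to have at start-up.
        pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2.0 * (q.w * q.x - q.y * q.z)))))
        roll = math.degrees(math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                                       1.0 - 2.0 * (q.x * q.x + q.z * q.z)))
        print("pitch: %.2f deg, roll: %.2f deg" % (pitch, roll))
finally:
    pipe.stop()
```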

What you ask is, imho, an open-ended question, so I will try to outline a method for translating the T265 pose into a "north-aligned" real-world orientation. It requires establishing (at least) two reference frames, performing matrix multiplications between them, and solving iteratively to obtain the north-aligned orientation.
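In rough notation of my own, one possible composition of the two frames is shown below; the exact multiplication order depends on how the frames are defined:

```latex
% T_pose(t)  : pose reported by the T265 at time t
% T_offset   : fixed transform from the T265 frame to the compass frame
% T_azimuth  : rotation aligning the compass frame so that 0 degrees points north
T_{north}(t) = T_{azimuth} \cdot T_{offset} \cdot T_{pose}(t)
```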

Note that the [xyz] component of the resulting pose is not needed and can be omitted if you are only interested in "rectifying" the North reference.

The flow for the two major cases:

1. T265 and the compass sensors are rigidly linked/mounted to a common frame (a minimal code sketch of this flow follows the list below):
   a) Calculate the mechanical 6DOF offset (translation and rotation) between the compass and T265 origins and use it to generate the [4x4] transformation matrix from the T265 CS to the compass CS.
   Then repeat in a loop:
   b) Capture the T265 pose and the compass azimuth, and generate [4x4] transformation matrices from both.
   c) Multiply the T265 pose by (a) to express it in the compass CS.
   d) Multiply the result of (c) by the compass's transformation.
   e) The [3x3] rotation component of the resulting [4x4] matrix gives the T265 rotation aligned to north.

2. T265 and the compass are linked in a kinematic chain, i.e. the compass and T265 sensors can move/rotate relative to each other. This scenario is not that different from the previous one; it requires some modest modifications, employing a rigid-body transformation method similar to the Denavit-Hartenberg convention, so that the transformation matrix established in 1.a above is updated dynamically. In practice both scenarios are technically similar, with (1) being the degenerate case of (2).
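A minimal numpy sketch of case 1 might look like the following. The offset translation, the azimuth reading and the choice of Y as the vertical axis are placeholder assumptions, and the multiplication order depends on how you define the frames, so treat this as a starting point rather than a reference implementation:

```python
import numpy as np

def quat_to_rot(x, y, z, w):
    # Convert a unit quaternion (as reported in the T265 pose data) to a 3x3 rotation matrix.
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ])

def make_T(R, t):
    # Assemble a [4x4] homogeneous transform from a [3x3] rotation and a translation vector.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# (a) Mechanical 6DOF offset between the compass and T265 origins, measured from the mount
#     (placeholder values): maps T265 coordinates into the compass CS.
T_offset = make_T(np.eye(3), np.array([0.03, 0.0, -0.01]))

# (b) One T265 pose sample (quaternion + translation from the pose frame) and one compass
#     azimuth reading; replace with live data from pyrealsense2 and your magnetometer driver.
T_pose = make_T(quat_to_rot(0.0, 0.0, 0.0, 1.0), np.zeros(3))
azimuth = np.deg2rad(37.0)  # heading in radians, 0 = north (placeholder value)

# The compass heading as a rotation about the vertical axis (Y up in the T265 convention).
c, s = np.cos(azimuth), np.sin(azimuth)
T_azimuth = make_T(np.array([[c, 0.0, s],
                             [0.0, 1.0, 0.0],
                             [-s, 0.0, c]]), np.zeros(3))

# (c) + (d) Chain the transforms: T265 pose -> compass CS -> north-referenced CS.
T_north = T_azimuth @ T_offset @ T_pose

# (e) The [3x3] rotation block is the T265 orientation aligned to north; the translation
#     column can be discarded if only the heading is of interest.
R_north = T_north[:3, :3]
print(np.round(R_north, 3))
```

For case 2 the only change is that `T_offset` is no longer constant: it has to be recomputed on every iteration from the current state of the kinematic chain before the same multiplication is applied.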

There are multiple robotic and non-robotic libraries and SDKs that facilitate these types of kinematic transformations, but that is out of the scope of this question.