Closed: JD-ETH closed this issue 4 years ago.
My suspicion is that this has to do with the coordinate systems. Unity actually uses a left-handed coordinate system under the hood, while nearly all robotics applications are right-handed. We have a brief note here explaining the coordinates. If you are querying the simulation via tesse-interface, you'll get back information in the left-handed coordinate system. For the competition, I believe we are providing pose information in a right-handed system, as that is the more typical convention in robotics.
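To make the handedness difference concrete, here is a minimal sketch of converting a Unity position (left-handed: x right, y up, z forward) into a right-handed robotics-style frame (x forward, y left, z up). The axis mapping shown is an assumption for illustration, not the convention tesse-interface actually uses; check the linked note for the exact mapping.

```python
import numpy as np

def unity_to_right_handed(p_unity):
    """Map a Unity left-handed position (x right, y up, z forward)
    into an assumed right-handed frame (x forward, y left, z up).
    Swapping/negating axes like this is what flips handedness; the
    particular mapping here is illustrative only."""
    x, y, z = p_unity
    return np.array([z, -x, y])

# Unity "1 m right, 2 m up, 3 m forward" becomes
# "3 m forward, 1 m to the right (negative y), 2 m up":
print(unity_to_right_handed((1.0, 2.0, 3.0)))  # [ 3. -1.  2.]
```

Note that rotations need the corresponding treatment (e.g. flipping the sign of the appropriate quaternion components), not just the positions.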
Considering this closed, as it seems to be more of a Unity issue.
I'm having trouble properly correlating observations from different frames into a common frame. Could you point out whether any of my assumptions are wrong?

According to getCamerInformation, the depth/rgb_left camera is located at (-0.05, 0, 0) with identity rotation in the robot frame. However, my reprojected point clouds are always misaligned whenever rotation is present. If I relocate the camera to (0, 0, 0) instead, the point clouds look perfectly aligned in the common global frame. So I would like to ask again whether my assumption about the extrinsics is correct: are they defined in a robot frame that shares the same coordinate-system convention as the image frame, i.e. East-Down-North = x-y-z?
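For reference, the transform chain I'm assuming when reprojecting is the standard composition below: a point in the camera frame goes through the body-to-camera extrinsic, then through the world-to-body pose. The extrinsic values mirror the ones quoted above; whether t_bc should be expressed in the robot frame or in the image-frame convention is exactly my question, so this is one hypothesis, not a confirmed answer.

```python
import numpy as np

def point_cam_to_world(p_cam, R_wb, t_wb,
                       t_bc=np.array([-0.05, 0.0, 0.0]),
                       R_bc=np.eye(3)):
    """Compose the assumed camera-in-body extrinsic (R_bc, t_bc) with
    the body-in-world pose (R_wb, t_wb):
        p_w = R_wb (R_bc p_c + t_bc) + t_wb
    If t_bc is defined in a different axis convention than assumed,
    the error only shows up once R_wb is non-identity, which matches
    the misalignment appearing only under rotation."""
    return R_wb @ (R_bc @ p_cam + t_bc) + t_wb

# With no body rotation, the camera origin lands at the extrinsic offset:
print(point_cam_to_world(np.zeros(3), np.eye(3), np.zeros(3)))  # [-0.05  0.    0.  ]

# With a 90-degree yaw, the same offset is rotated into the world frame:
R_yaw90 = np.array([[0.0, -1.0, 0.0],
                    [1.0,  0.0, 0.0],
                    [0.0,  0.0, 1.0]])
print(point_cam_to_world(np.zeros(3), R_yaw90, np.zeros(3)))  # [ 0.   -0.05  0.  ]
```

This is why zeroing the extrinsic "fixes" the alignment: with t_bc = 0 the offending term drops out of the composition entirely.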
Thanks a lot!