nutonomy / nuscenes-devkit

The devkit of the nuScenes dataset.
https://www.nuScenes.org

Trouble interpreting the camera rotation from the sensor_calibration.json file #268

Closed Marel-88 closed 4 years ago

Marel-88 commented 4 years ago

Hi,

Just to clarify the question: we are having trouble interpreting the camera rotations from the sensor_calibration.json file. We know your data is correct, but we would appreciate some advice from you on how to interpret it correctly.*

The description of our issue is the following one:

  1. theoretical interpretation: our interpretation is that in sensor_calibration.json the rotation and translation of every camera are given with respect to the car for every single scene, so if we plot the camera setup for a given scene we should obtain six cameras pointing in different directions.

  2. 1st implementation: to verify point 1, we used the Python 'transformations' module (https://pypi.org/project/transformations/) to extract the yaw from each of the six camera quaternions; the result we obtained is that all the yaw values are very close to each other.

  3. how we created the plot: we took the rotations and translations of the six cameras for the scene 'cc8c0bf57f984915a77078b10eb33198' from sensor_calibration.json (just as a toy implementation), then we used the MATLAB function plotCamera() and converted each quaternion to a rotation matrix with MATLAB's quat2rotm(); see the code** and the result in the attached image.

*. why are we doing this? Just to give you some context about our project: we are trying to project 2D detections from the image canvas to world coordinates using a depth approximation based on the average height of pedestrians (we know this is not accurate, but we want to know how far we can get without using the lidar). However, the preliminary results showed that all the pedestrians were projected to the front of the car, and during debugging we traced the error to our interpretation of the camera rotations.
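As a rough illustration of that projection idea, here is a minimal Python sketch under a pinhole camera model. The pixel coordinates, box height, and intrinsics below are made-up placeholders, and the extrinsics are the first translation/rotation pair from the MATLAB snippet below; this is only a sketch of the approach, not devkit code.

import numpy as np
from pyquaternion import Quaternion

# Camera intrinsics (placeholder values) and camera -> ego extrinsics
# (first translation/rotation pair from the calibration data below).
K = np.array([[1266.4,    0.0, 816.3],
              [   0.0, 1266.4, 491.5],
              [   0.0,    0.0,   1.0]])
cam_to_ego_rot = Quaternion([0.49980154, -0.50303162, 0.49977981, -0.49737084])
cam_to_ego_trans = np.array([1.70079119, 0.01594563, 1.51095764])

# 2D pedestrian detection: bottom-center pixel and box height in pixels (placeholders).
u, v = 900.0, 700.0
box_height_px = 150.0
assumed_height_m = 1.70  # average pedestrian height assumption

# Depth from similar triangles: box_height_px = f_y * H / Z  =>  Z = f_y * H / box_height_px.
depth = K[1, 1] * assumed_height_m / box_height_px

# Back-project the pixel into the camera frame (x right, y down, z forward).
point_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])

# Camera frame -> ego vehicle frame using the calibration rotation and translation.
point_ego = cam_to_ego_rot.rotation_matrix @ point_cam + cam_to_ego_trans
print(point_ego)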

**MATLAB code:

close all
clear all

scene = 'cc8c0bf57f984915a77078b10eb33198';
camera_id = '725903f5b62f56118f4094b46a4470d8';

% Per-camera translations (one row per camera) and quaternions (w x y z),
% taken from sensor_calibration.json.
translation = [1.70079119  0.01594563  1.51095764;
               1.55084775 -0.4934048   1.49574801;
               1.0148781  -0.48056822  1.56239545;
               0.02832603  0.00345137  1.57910346;
               1.035691    0.48479503  1.59097015;
               1.52387798  0.49463134  1.50932822];
rotation = [0.49980154 -0.50303162  0.49977981 -0.49737084;
            0.2060348  -0.20269406  0.68245078 -0.67136109;
            0.1228098  -0.13240084 -0.70043058  0.69049603;
            0.50378727 -0.4974025  -0.49418502  0.50454961;
            0.69241856 -0.70316194 -0.11648343  0.11203318;
            0.6757265  -0.67362665  0.21214015 -0.21122827];

% Plot each camera at its translation with the orientation from its quaternion.
for i = 1:6
    cam = plotCamera('Location', translation(i,:), 'Orientation', quat2rotm(rotation(i,:)), 'Opacity', 0.2);
    hold on
end
grid on
axis equal

(attached image: plot of the six cameras produced by the MATLAB code above)

Originally posted by @Marel-88 in https://github.com/nutonomy/nuscenes-devkit/issues/265#issuecomment-566426352

holger-motional commented 4 years ago

theoretical interpretation: our interpretation is that in sensor_calibration.json the rotation and translation of every camera are given with respect to the car for every single scene, so if we plot the camera setup for a given scene we should obtain six cameras pointing in different directions.

That's correct.
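For reference, these calibration records can be read through the devkit roughly as follows; this is just a minimal sketch, and the dataroot path is a placeholder.

from nuscenes.nuscenes import NuScenes

# Load the mini split (the dataroot path is a placeholder).
nusc = NuScenes(version='v1.0-mini', dataroot='/data/sets/nuscenes', verbose=False)

# Take any sample and look up the calibrated sensor record of each camera.
sample = nusc.sample[0]
for channel in ['CAM_FRONT', 'CAM_FRONT_LEFT', 'CAM_FRONT_RIGHT',
                'CAM_BACK', 'CAM_BACK_LEFT', 'CAM_BACK_RIGHT']:
    sd_rec = nusc.get('sample_data', sample['data'][channel])
    cs_rec = nusc.get('calibrated_sensor', sd_rec['calibrated_sensor_token'])
    # 'translation' is in meters and 'rotation' is a (w, x, y, z) quaternion,
    # both mapping the camera frame into the ego vehicle frame.
    print(channel, cs_rec['translation'], cs_rec['rotation'])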

1st implementation: to verify point 1, we used the Python 'transformations' module (https://pypi.org/project/transformations/) to extract the yaw from each of the six camera quaternions; the result we obtained is that all the yaw values are very close to each other.

That result is of course wrong. I recommend that you don't use external libraries, which may make different assumptions, but rather use the provided code and the pyquaternion library. You can see the different coordinate frames at https://www.nuscenes.org/public/images/data.png . Note that the camera and the lidar have different coordinate frames.
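A minimal sketch of that with pyquaternion, using the six quaternions from the question: assuming the camera frame shown in data.png (x right, y down, z forward), the camera-to-ego rotation contains a roughly 90-degree tilt, so reading a yaw directly off the quaternion is misleading; rotating the optical axis (+z) into the ego frame and measuring its heading shows that the cameras do point in different directions.

import numpy as np
from pyquaternion import Quaternion

# Camera -> ego quaternions (w, x, y, z) for the six cameras, copied from the question.
rotations = [
    [ 0.49980154, -0.50303162,  0.49977981, -0.49737084],
    [ 0.2060348,  -0.20269406,  0.68245078, -0.67136109],
    [ 0.1228098,  -0.13240084, -0.70043058,  0.69049603],
    [ 0.50378727, -0.4974025,  -0.49418502,  0.50454961],
    [ 0.69241856, -0.70316194, -0.11648343,  0.11203318],
    [ 0.6757265,  -0.67362665,  0.21214015, -0.21122827],
]

for q in rotations:
    rot = Quaternion(q)
    # The optical axis is +z in the camera frame; rotate it into the ego frame
    # (x forward, y left, z up) and measure its heading in the ground plane.
    view_dir = rot.rotate(np.array([0.0, 0.0, 1.0]))
    heading_deg = np.degrees(np.arctan2(view_dir[1], view_dir[0]))
    print(round(heading_deg, 1))

Unlike the near-identical yaw values extracted from the raw quaternions, these headings should clearly differ from camera to camera.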

how we created the plot: we took the rotations and translations of the six cameras for the scene 'cc8c0bf57f984915a77078b10eb33198' from sensor_calibration.json (just as a toy implementation), then we used the MATLAB function plotCamera() and converted each quaternion to a rotation matrix with MATLAB's quat2rotm(); see the code** and the result in the attached image.

Unfortunately I have never used these and therefore cannot help you with the details. I suggest you go step by step: