argoverse / av2-api

Argoverse 2: Next generation datasets for self-driving perception and forecasting.
https://argoverse.github.io/user-guide/
MIT License

Calibration Method and Recording? #215

Closed aryasenna closed 7 months ago

aryasenna commented 11 months ago

Hello!

Can anyone tell us how the Argoverse 2 sensors were calibrated, especially the camera intrinsics and the LiDAR-camera extrinsics? The paper does not cover this topic.

I'm interested in what kind of accuracy (e.g., reprojection error) you achieved for the Perception Dataset, and in the checkerboard recordings themselves.

benjaminrwilson commented 11 months ago

Hi @aryasenna,

Thank you for your interest in the dataset. The ring cameras were calibrated using a turntable in conjunction with a chessboard. Unfortunately, we do not have numbers available to quantify reprojection error.
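For anyone curious, here is a minimal sketch of that style of chessboard intrinsic calibration with OpenCV. It is not the actual Argoverse 2 pipeline; the pattern size, square size, and image paths are placeholders.

```python
# Illustrative chessboard intrinsic calibration with OpenCV.
# NOT the actual Argoverse 2 pipeline; pattern size, square size,
# and image paths are assumptions for the sketch.
import glob

import cv2
import numpy as np

PATTERN = (9, 6)       # inner corners per row/column (assumed)
SQUARE_SIZE_M = 0.05   # physical square size in meters (assumed)

# 3D corner coordinates in the board frame (z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE_M

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.png"):   # hypothetical recording
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]              # (width, height)

# `rms` is the overall RMS reprojection error (pixels) of the fitted model.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"RMS reprojection error: {rms:.3f} px")
```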

Ben

aryasenna commented 11 months ago

hi @benjaminrwilson

Thanks for getting back to me. I see there are no reprojection error numbers at the moment; would it be possible to publish the checkerboard recordings so the public can run their own calibration?

For my (academic) research, I'm checking whether my calibration method outperforms the one used for Argoverse, with the goal of publishing the method.

James-Hays commented 11 months ago

Hi @aryasenna

We don't have the original calibration sequences to release.

Calibration is important and difficult to get right. We think the Argoverse 2 calibration is pretty good, but I don't doubt there is room for improvement in certain scenarios. We're open to someone suggesting an improved calibration. It would need to be based on analysis of the particular driving scenarios, not a calibration pattern.
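To make that concrete, here is a rough sketch of the kind of scene-based check this implies: project a LiDAR sweep into a ring-camera image and look for systematic offsets at depth edges (poles, lane markings). The loading step and variable names are placeholders, not av2-api calls.

```python
# Hypothetical LiDAR-camera alignment check on a driving log.
# Loading helpers are placeholders; av2-api provides its own loaders
# and calibration objects.
import numpy as np


def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """points_lidar: (N, 3) points in the LiDAR frame.
    T_cam_lidar: (4, 4) extrinsic mapping LiDAR -> camera frame.
    K: (3, 3) camera intrinsics.
    Returns pixel coordinates and a validity mask."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]    # into the camera frame
    in_front = pts_cam[:, 2] > 0.1                # keep points ahead of the camera
    uvw = (K @ pts_cam.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]                 # perspective divide
    return uv, in_front

# Usage (placeholder arrays; in practice these come from a sweep + calibration):
# uv, valid = project_lidar_to_image(sweep_xyz, T_cam_lidar, K)
# Overlay uv[valid] on the image and inspect for systematic misalignment.
```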

aryasenna commented 11 months ago

Hi @James-Hays

Thanks for chiming in :)

"We're open to someone suggesting an improved calibration. It would need to be based on analysis of the particular driving scenarios, not a calibration pattern."

From what I gather, the classical way to compare calibration performance is with a pattern, since the pattern is used to calculate the reprojection error (I mean something like [1] for the camera and [2] for camera-LiDAR). I'm interested in understanding the baseline error you achieved, and whether an alternative calibration method would result in a lower error.
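As a concrete (hypothetical) example of the comparison I have in mind, something along these lines, where the board points and detected corners would come from a released pattern recording:

```python
# Sketch of a pattern-based comparison: given detected chessboard corners
# and their known board coordinates, compute the mean reprojection error
# for a candidate calibration. Variable names are placeholders.
import cv2
import numpy as np


def mean_reprojection_error(obj_points, img_points, K, dist):
    """obj_points / img_points: per-view lists of (N, 3) board points and
    (N, 1, 2) detected corners. K, dist: candidate intrinsics/distortion."""
    errors = []
    for objp, imgp in zip(obj_points, img_points):
        # Fit the board pose for this view, then reproject the board points.
        _, rvec, tvec = cv2.solvePnP(objp, imgp, K, dist)
        proj, _ = cv2.projectPoints(objp, rvec, tvec, K, dist)
        errors.append(np.linalg.norm(proj - imgp, axis=2).mean())
    return float(np.mean(errors))

# A lower mean error on held-out pattern views indicates a better calibration.
```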

I 100% agree that we should also check the actual driving scenarios, to see where we do or do not have a calibration problem. I have been working with KITTI, which provides the checkerboard recordings [3]; this helps (to some degree) to verify and even improve the calibration, and lets us use the same metrics.

It's great to hear that you are interested in improved calibration, which happens to be the topic my colleagues and I are working on. May I contact you at your Georgia Tech email?

[1] https://www.camcalib.io/post/what-is-the-reprojection-error
[2] https://github.com/mfxox/ILCC
[3] https://www.cvlibs.net/datasets/kitti/raw_data.php?type=calibration