weders opened this issue 2 years ago
If I had to guess, a loop-closure event in the SLAM system likely caused the camera poses to jump.
We annotate in 3D and rely on the camera poses to project the annotations into the 2D images. The camera poses are obtained from first-party online SLAM systems (e.g., ARKit on iPhone). These SLAM systems usually include a loop-closure feature that re-optimizes the camera poses to reduce drift; the side effect can be a jump in the camera-pose trajectory, which introduces the artifacts you see in the dataset.
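For intuition, here is a minimal sketch of that projection step, assuming a pinhole camera model and a 4x4 world-to-camera pose per frame. The function and variable names (`project_points`, `world_to_camera`, `intrinsics`) are illustrative, not the dataset's actual API:

```python
import numpy as np

def project_points(points_world, world_to_camera, intrinsics):
    """Project Nx3 world-space points (e.g. 3D box vertices) into 2D pixels.

    points_world    : (N, 3) array of 3D annotation points.
    world_to_camera : (4, 4) camera extrinsics for this frame (from SLAM).
    intrinsics      : (3, 3) pinhole camera matrix K.
    """
    points_world = np.asarray(points_world, dtype=float)
    # Homogenize and transform the points into the camera frame.
    points_h = np.hstack([points_world, np.ones((points_world.shape[0], 1))])
    points_cam = (world_to_camera @ points_h.T).T[:, :3]
    # Pinhole projection: apply K, then divide by depth.
    pixels_h = (intrinsics @ points_cam.T).T
    return pixels_h[:, :2] / pixels_h[:, 2:3]
```

If loop closure shifts the pose of a single frame, `world_to_camera` no longer matches what the camera actually saw in that frame, so the projected box lands in the wrong place even though the 3D annotation itself is unchanged.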
When filtering the output data, we tried to remove the videos that have this kind of drift, but a few of them eventually slipped past QA.
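The actual QA filter isn't published here, but a rough heuristic for spotting such jumps could look like the sketch below, which simply flags frames whose camera position moves more than a threshold between consecutive frames (the 5 cm threshold is arbitrary and purely illustrative):

```python
import numpy as np

def find_pose_jumps(camera_to_world_poses, max_step_m=0.05):
    """Return frame indices where the camera translation jumps abruptly.

    camera_to_world_poses : sequence of (4, 4) poses, one per frame.
    max_step_m            : threshold on the per-frame translation change
                            in metres (illustrative value only).
    """
    positions = np.array([np.asarray(p)[:3, 3] for p in camera_to_world_poses])
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return [i + 1 for i, step in enumerate(steps) if step > max_step_m]
```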
Thanks a lot for releasing this great dataset!
While parsing the 2D projections of the 3D bounding boxes using the provided notebook, I noticed that the annotations are incorrect for some objects and poses.
E.g., for scene `bottle/batch-6/7` and frame id 273, the projection is quite off (see attached image), while for other frames in the same scene the bounding box is perfectly fine. Do you have any idea whether this is caused by the projection or by the annotation of the 3D bounding box? Thanks!
I obtained these results by running the notebook linked above for scene `bottle/batch-6/7` and `frame_id = 273`.