Open goldbattle opened 2 years ago
Hi Patrick, thanks for your questions.
The /natnet_ros/rigid_bodies/Phasma/pose topic is in the marker frame and not in the /alphasense/imu frame. The motion capture system and the IMU / cameras are not synced; timestamping was done on "time of arrival", so there is an offset of a few ms between the motion capture clock and the camera clock. I am currently uploading a new dataset (LAB_2) with the ground-truth trajectory synced (in post-processing) and transformed into the IMU frame!
Trajectories should be submitted in the /alphasense/imu frame.
We do not provide a script to evaluate the trajectories right now, but we are planning on releasing it soon!
Best Michael
@hemi86 When will the script be released? Also, I noticed some incorrect measurements in the 3DoF ground truth. I wrote a script to align the ground truth with our result. The picture below shows the trajectories of Basement_1. The marker '+' denotes the i-th ground-truth measurement; the 6th one is incorrect, and a ground-truth measurement is missing near the 4th. Thanks.
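For anyone writing a similar alignment script: pairing each ground-truth measurement with the nearest estimated pose by timestamp can be sketched like this (the function name and the `max_dt` tolerance are my own choices, not part of any official dataset tooling):

```python
import numpy as np

def associate(gt_times, est_times, max_dt=0.02):
    """For each ground-truth timestamp, find the index of the closest
    estimated timestamp; pairs further apart than max_dt seconds are dropped."""
    est_times = np.asarray(est_times)
    pairs = []
    for i, t in enumerate(gt_times):
        j = int(np.argmin(np.abs(est_times - t)))
        if abs(est_times[j] - t) <= max_dt:
            pairs.append((i, j))
    return pairs
```

With the matched index pairs in hand, one can compare positions measurement by measurement, which is how a glitch like the wrong 6th point shows up.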
Indeed, there was a wrong point in the file; we uploaded a new version. Thanks for letting us know!
For evaluation using ATE (Absolute trajectory error), it is mentioned that "For example, after converting the pose to a transformation matrix Twi, one should be able to transform the homogeneous point coordinates in IMU frame to world frame as pw = Twi * pi."
My question is: is the transformation Twi provided by you?
If not, can a starting 6DoF pose (tx ty tz qx qy qz qw) of (0, 0, 0, 0, 0, 0, 1) be used?
And will there be trajectory alignment in the evaluation?
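In case it helps, the conversion described in the quoted text ("pw = Twi * pi") can be sketched in plain NumPy. This assumes the tx ty tz qx qy qz qw ordering used in the ground-truth files and a unit quaternion; it is a sketch, not official challenge code:

```python
import numpy as np

def pose_to_matrix(tx, ty, tz, qx, qy, qz, qw):
    """Build the 4x4 homogeneous transform T_wi from a tx ty tz qx qy qz qw pose
    (unit quaternion, scalar-last convention)."""
    R = np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [tx, ty, tz]
    return T

# With the identity pose (0, 0, 0, 0, 0, 0, 1), a homogeneous point in the
# IMU frame maps to the same coordinates in the world frame.
T_wi = pose_to_matrix(0, 0, 0, 0, 0, 0, 1)
p_i = np.array([1.0, 2.0, 3.0, 1.0])  # homogeneous point in IMU frame
p_w = T_wi @ p_i
```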
In the 6DoF ground truth (https://storage.googleapis.com/hilti_challenge/uzh_tracking_area_run2_imu.txt), I think there are a few glitches (sudden jumps in position).
Are they significant enough to affect the evaluation results?
For example, at the following timestamps:
1630577781.07 0.0429409099274 0.0260854245239 0.592175765258 -0.9994178967651917 -0.025314835102480504 0.021278231497765976 0.008382339381251547
1630577781.37 0.0429409099274 0.0260854245239 0.592175765258 -0.9994178967651917 -0.025314835102480504 0.021278231497765976 0.008382339381251547
1630577782.53 0.0429409099274 0.0260854245239 0.592175765258 -0.9994178967651917 -0.025314835102480504 0.021278231497765976 0.008382339381251547
@gs14iitbbs We use evo to align the 6DoF trajectory. Yes, there are a few glitches in the 6DoF ground truth; we just delete these poses.
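Deleting such rows can be done with one pass over the file. A hedged sketch (my own heuristic, not the organizers' actual filter) that drops exactly repeated positions like the frozen ones quoted above:

```python
def drop_frozen_poses(rows):
    """rows: list of [t, tx, ty, tz, ...] ground-truth rows.
    Keep only the first row of each run with an exactly repeated position,
    which is what a frozen motion-capture track looks like."""
    kept = [rows[0]]
    for row in rows[1:]:
        if row[1:4] != kept[-1][1:4]:  # position changed -> keep the row
            kept.append(row)
    return kept
```

Sudden jumps (rather than frozen values) would need a different check, e.g. thresholding the position change per unit time.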
@gs14iitbbs For the final evaluation we make sure there are no outliers in the ground truth, and as @zerolover suggested, evo is a good tool for comparing two trajectories. For our evaluation we will use https://github.com/uzh-rpg/rpg_trajectory_evaluation.
I was hoping you could tell us which method for alignment you'll be using for evaluation:
"Different trajectory alignment methods (rigid-body, similarity and yaw-only rotation)"
We will use rigid-body (SE3) alignment.
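For anyone curious what rigid-body (SE3) alignment does: it solves for the rotation R and translation t minimizing the sum of squared distances between corresponding trajectory points, with no scale or yaw-only restriction. A minimal NumPy sketch using the Kabsch algorithm (a generic implementation, not the code evo or rpg_trajectory_evaluation actually use):

```python
import numpy as np

def align_se3(P, Q):
    """Find R (3x3) and t (3,) minimizing sum_i ||R @ P[i] + t - Q[i]||^2.
    P, Q: (N, 3) arrays of corresponding points (Kabsch algorithm)."""
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    H = (P - mu_p).T @ (Q - mu_q)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_q - R @ mu_p
    return R, t
```

Here P would be the estimated positions and Q the matched ground-truth positions; ATE is then computed on the residuals after applying R and t.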
I was wondering how to evaluate with LAB_Survey_1.bag, or if there is an already pre-processed ground-truth trajectory for it. Is /natnet_ros/rigid_bodies/Phasma/pose in the marker frame of the sensor platform, or already in the IMU frame of /alphasense/imu_adis? It looks like the base frame is /alphasense/imu, so is this the frame trajectories should be submitted in? Are the Vicon timestamps synced with the IMU & cameras, or is an offset needed?
Do you have an example processing script / commands to run to evaluate a generated trajectory against this?
Thanks.