Huangying-Zhan / kitti-odom-eval

KITTI Odometry Evaluation Toolbox
MIT License

Monocular Visual Odometry 6 DOF #7

essamgoda opened this issue 4 years ago

essamgoda commented 4 years ago

I'm trying to evaluate KITTI odometry results. First, I convert the poses to 6DoF using this code:

import math
import numpy as np

def rotationMatrixToEulerAngles(self, R):
    # Recover Euler angles (x, y, z) from a rotation matrix,
    # assuming the convention R = Rz @ Ry @ Rx.
    assert self.isRotationMatrix(R)
    sy = math.sqrt(R[0, 0] * R[0, 0] + R[1, 0] * R[1, 0])
    singular = sy < 1e-6

    if not singular:
        x = math.atan2(R[2, 1], R[2, 2])
        y = math.atan2(-R[2, 0], sy)
        z = math.atan2(R[1, 0], R[0, 0])
    else:
        # Gimbal lock: pitch is near +/-90 deg, so roll and yaw are
        # not separable; set z = 0 by convention.
        x = math.atan2(-R[1, 2], R[1, 1])
        y = math.atan2(-R[2, 0], sy)
        z = 0
    return np.array([x, y, z], dtype=np.float32)

def matrix_rt(self, p):
    # Reshape a 12-value pose row into a 4x4 homogeneous matrix.
    return np.vstack([np.reshape(p.astype(np.float32), (3, 4)), [[0., 0., 0., 1.]]])

# Relative pose of frame i+1 with respect to frame i
pose1 = self.matrix_rt(self.poses[index][i])
pose2 = self.matrix_rt(self.poses[index][i + 1])
pose2wrt1 = np.dot(np.linalg.inv(pose1), pose2)
R = pose2wrt1[0:3, 0:3]
t = pose2wrt1[0:3, 3]
angles = self.rotationMatrixToEulerAngles(R)
odometries.append(np.concatenate((t, angles)))

Also, the model output is in the same format (6DoF).

The question is: how do I evaluate 6DoF results?

Huangying-Zhan commented 4 years ago

Hi @essamgoda, this code is designed to evaluate poses represented as matrices (i.e. 3x4 matrices). You don't need to convert the poses to 6DoF for evaluation.

Please check this section for pose loading:

https://github.com/Huangying-Zhan/kitti-odom-eval/blob/4b850b0815dcd28edef893ca3422125b04a02397/kitti_odometry.py#L83
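
For reference, the standard KITTI odometry pose format (which, as far as I can tell, is what this loader expects) is one pose per line, with each line holding the 12 entries of the 3x4 [R|t] matrix in row-major order:

r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz

where line i is the absolute pose of frame i expressed in the coordinate system of the first frame.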

essamgoda commented 4 years ago

Hi @Huangying-Zhan, I'm trying to implement DeepVO like this repo. That repo converts poses to 6-DoF as shown above for training, and the model output is also 6-DoF. The question is how to evaluate 6-DoF results, or how to convert 6-DoF to 12-DoF in order to use the evaluation tool.

Huangying-Zhan commented 4 years ago

Hi @essamgoda, your provided code shows that you first convert the rotation matrix to Euler angles (3DoF) and concatenate them with the translation vector to form the 6-DoF representation. 12-DoF is actually a truncated version of the transformation matrix, which is 16DoF, i.e.

# 16DoF
R00 R01 R02 t1
R10 R11 R12 t2
R20 R21 R22 t3
0   0   0    1

# 12DoF is nothing but the first three rows of the transformation matrix, i.e.
R00 R01 R02 t1
R10 R11 R12 t2
R20 R21 R22 t3

Therefore, you don't need to convert the rotation matrix to Euler angles. You can just concatenate the rotation matrix with the translation vector to form the 3x4 matrix and save it in a txt file for evaluation.
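
A minimal sketch of that saving step (the function name is illustrative; it assumes `poses` is a list of 4x4 numpy arrays holding absolute poses w.r.t. the first frame):

import numpy as np

def save_kitti_poses(poses, path):
    # Write each pose as the 12 values of its top 3x4 block,
    # row-major, one pose per line (KITTI format).
    with open(path, "w") as f:
        for T in poses:
            f.write(" ".join(f"{v:.9e}" for v in T[:3, :].flatten()) + "\n")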

Hope this helps.

essamgoda commented 4 years ago

Hi @Huangying-Zhan, the proposed model predicts poses in 6-DoF format (the translation vector concatenated with the Euler angles), which is why I asked how to convert this format to 12-DoF for evaluation.

In the code mentioned above, the Euler angles are converted back to a rotation matrix and concatenated with the translation vector:

import numpy as np

def eulerAnglesToRotationMatrix(theta):
    # Build R = Rz @ Ry @ Rx from Euler angles theta = [x, y, z]
    # (called as a free function below, so no self argument).
    R_x = np.array([[1, 0, 0],
                    [0, np.cos(theta[0]), -np.sin(theta[0])],
                    [0, np.sin(theta[0]), np.cos(theta[0])]])
    R_y = np.array([[np.cos(theta[1]), 0, np.sin(theta[1])],
                    [0, 1, 0],
                    [-np.sin(theta[1]), 0, np.cos(theta[1])]])
    R_z = np.array([[np.cos(theta[2]), -np.sin(theta[2]), 0],
                    [np.sin(theta[2]), np.cos(theta[2]), 0],
                    [0, 0, 1]])
    return np.dot(R_z, np.dot(R_y, R_x))

# T holds the accumulated absolute pose; it should be initialized
# to np.eye(4) before the first prediction.
R = eulerAnglesToRotationMatrix(pred[3:])
t = pred[:3].reshape(3, 1)
T_r = np.concatenate((np.concatenate([R, t], axis=1), [[0.0, 0.0, 0.0, 1.0]]), axis=0)

# With respect to the first frame
T_abs = np.dot(T, T_r)
# Update the T matrix till now.
T = T_abs

but this is not working well for me.
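
To narrow down where it goes wrong, one possible sanity check on the conversion (a sketch; `pred` as in the snippet above):

# The rebuilt R should be a proper rotation matrix: orthonormal
# with determinant +1. If this fails, the Euler-angle convention
# or the element ordering of `pred` is off.
R = eulerAnglesToRotationMatrix(pred[3:])
assert np.allclose(R @ R.T, np.eye(3), atol=1e-6)
assert np.isclose(np.linalg.det(R), 1.0, atol=1e-6)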

The paper for the proposed model: DeepVO.

Thanks for your replies.

Huangying-Zhan commented 4 years ago

@essamgoda, I see now. You can refer to the code here for the conversion:

https://github.com/JiawangBian/SC-SfMLearner-Release/blob/97d4f26d4cb4567e5ece91f245af5dba441e7c77/inverse_warp.py#L139
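
For context, a rough numpy sketch of that kind of conversion (not the linked implementation; it reuses eulerAnglesToRotationMatrix from above and assumes the 6-DoF vector is ordered [tx, ty, tz, rx, ry, rz]):

import numpy as np

def pose_vec_to_mat(vec):
    # 6-DoF vector -> 4x4 homogeneous transformation matrix.
    # The [tx, ty, tz, rx, ry, rz] ordering is an assumption.
    T = np.eye(4)
    T[:3, :3] = eulerAnglesToRotationMatrix(vec[3:])
    T[:3, 3] = vec[:3]
    return T

# Chain relative predictions into absolute poses, starting from
# the identity; `predictions` (a list of 6-DoF vectors) is assumed.
T = np.eye(4)
abs_poses = [T.copy()]
for vec in predictions:
    T = T @ pose_vec_to_mat(vec)
    abs_poses.append(T.copy())
# abs_poses can then be written out as 3x4 rows for the evaluator.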

Aliahme5636 commented 3 years ago

Hi @Huangying-Zhan, I'm using the same code and the results are very bad when evaluated with kitti-odom-eval, although the RMSE is very good (the RMSE between the estimated 6-DoF poses and the ground truth).

Also, if I copy the KITTI ground-truth poses for any sequence and run kitti-odom-eval on them, the translational error is 7.3 and "Rotational is Rotational error". (Why does this happen? I'd be thankful for a pointer to any reference or tutorial.)

Thanks for the help.

Huangying-Zhan commented 3 years ago

Hi @Aliahme5636

I'm using the same code and the results are very bad when evaluated with kitti-odom-eval, although the RMSE is very good (the RMSE between the estimated 6-DoF poses and the ground truth).

I have no clue at the moment. Could you provide more information?

Also, if I copy the KITTI ground-truth poses for any sequence and run kitti-odom-eval on them, the translational error is 7.3 and "Rotational is Rotational error". (Why does this happen? I'd be thankful for a pointer to any reference or tutorial.)

If you meant that all sequences have the same translational error (7.3), could you check how the files are read? It seems to me that the wrong files are being loaded. Could you explain more about "Rotational is Rotational error"?
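
One quick way to check (a sketch; the paths are illustrative):

import numpy as np

# Load both files and compare; identical files should give a
# maximum absolute difference of exactly 0.
gt = np.loadtxt("gt_poses/09.txt")
pred = np.loadtxt("result/09.txt")
print(gt.shape, pred.shape)     # expect (N, 12) in KITTI format
print(np.abs(gt - pred).max())  # 0.0 if the files are identical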

Aliahme5636 commented 3 years ago

Hi @Huangying-Zhan, I'm using a deep learning model and the RMSE is like this:

Translational RMSE: 0.284642
Rotational RMSE: 0.266223

although the eval tool outputs:

Translational error (%): 102.90453146933586
Rotational error (deg/100m): 18.62600435469482

All sequences show the same behavior (when using the KITTI ground-truth poses as both the estimate and the ground truth, i.e. the same file), like below:

sequence 03
Translational error (%):  3.0627180900916957e-14
Rotational error (deg/100m):  8.147885968710791e-08
sequence 09
Translational error (%):  2.4011952673267497e-14
Rotational error (deg/100m):  8.028623189108894e-08

"Rotational is Rotational error" is write by mistake it should (Rotational is 8.14) sorry for that