papagina / RotationContinuity

Code for "On the Continuity of Rotation Representations"
MIT License

About the function that transforms your 6D Representation to Rotation Matrix #2

Closed ismarou closed 5 years ago

ismarou commented 5 years ago

Hi again! May I ask you something more about your project code?

The function you use to transform the 6D output vector from the last linear layer into a rotation matrix in SO(3) appears to assume that the 6D output consists of two orthogonal 3D vectors, and it doesn't use the Gram-Schmidt-like process you describe in the paper for the second column. Is this a bug, or am I missing something and there is a reason to assume this orthogonality?

Thanks in advance :)
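For concreteness, the Gram-Schmidt-style mapping described in the paper can be sketched in NumPy as follows (a minimal illustration; the function name and the split of the 6D vector into `a1`, `a2` are mine, not taken from the repository):

```python
import numpy as np

def six_d_to_rotation_gs(x6):
    """Gram-Schmidt-style mapping from a 6D vector to a rotation matrix (sketch)."""
    a1, a2 = x6[:3], x6[3:]
    b1 = a1 / np.linalg.norm(a1)           # normalize the first 3D vector
    b2 = a2 - np.dot(b1, a2) * b1          # remove the component of a2 along b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)                  # complete a right-handed orthonormal basis
    return np.stack((b1, b2, b3), axis=1)  # columns are b1, b2, b3
```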

papagina commented 5 years ago

Hi, that function uses a different orthogonalization formula, but it gives results similar to the formula in the paper. And it doesn't assume the two 3D vectors are orthogonal.
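A cross-product-based orthogonalization of that kind can be sketched as follows (a minimal sketch, assuming the same split into `a1`, `a2`; the function name is illustrative and this is not the repository code verbatim):

```python
import numpy as np

def six_d_to_rotation_cross(x6):
    """Cross-product-style orthogonalization; a2 need not be orthogonal to a1 (sketch)."""
    a1, a2 = x6[:3], x6[3:]
    b1 = a1 / np.linalg.norm(a1)           # first column: normalized a1
    b3 = np.cross(b1, a2)                  # perpendicular to the plane spanned by b1 and a2
    b3 = b3 / np.linalg.norm(b3)
    b2 = np.cross(b3, b1)                  # completes a right-handed orthonormal basis
    return np.stack((b1, b2, b3), axis=1)
```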

ismarou commented 5 years ago

Yeah, you are right about the orthogonality, my bad. Indeed, I tried them both out and the results are similar in my application as well. I'm closing the issue, thanks :)

connellybarnes commented 4 years ago

The equation in the paper and Yi's code are actually mathematically equivalent: they are two different ways of constructing the same orthogonal basis. We established this using properties of dot and cross products and also checked it numerically.
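A quick numerical spot-check of that equivalence could look like the following (an illustrative sketch, not taken from the repository):

```python
import numpy as np

# Compare the Gram-Schmidt construction and the cross-product construction
# on random inputs; they should produce the same rotation matrix.
rng = np.random.default_rng(0)
for _ in range(1000):
    a1, a2 = rng.standard_normal(3), rng.standard_normal(3)

    # Gram-Schmidt construction (as in the paper)
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - np.dot(b1, a2) * b1
    b2 /= np.linalg.norm(b2)
    b3 = np.cross(b1, b2)
    R_gs = np.stack((b1, b2, b3), axis=1)

    # Cross-product construction
    c3 = np.cross(b1, a2)
    c3 /= np.linalg.norm(c3)
    c2 = np.cross(c3, b1)
    R_cp = np.stack((b1, c2, c3), axis=1)

    assert np.allclose(R_gs, R_cp)

print("constructions agree on 1000 random samples")
```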

extragoya commented 3 years ago

I had a similar question, actually. When I tried the version in the code vs. the version in the paper, the numbers are the same except that one vector is negated, which makes the determinant negative (see below). Am I missing something here?

```python
import numpy as np

a1 = np.random.randn(3)
a2 = np.random.randn(3)

# First construction: Gram-Schmidt, as in the paper
b1 = a1 / np.linalg.norm(a1)
b2 = a2 - np.dot(a2, b1) * b1
b2 = b2 / np.linalg.norm(b2)
b3 = np.cross(b1, b2)
R1 = np.stack((b1, b2, b3), axis=1)
print('R1', R1)
print('First det', np.linalg.det(R1))

# Second construction: cross products
b3 = np.cross(b1, a2)
b3 = b3 / np.linalg.norm(b3)
b2 = np.cross(b1, b3)
R2 = np.stack((b1, b2, b3), axis=1)
print('R2', R2)
print('Second det', np.linalg.det(R2))
```

Jhc-china commented 3 years ago

> I had a similar question actually, when I tried the version in the code vs the code in the paper, the numbers are the same but one vector is negated, which turns the determinant negative (see the code above). Am I missing something here?


I think the cross product in the 'Second' construction should be

```python
b2 = np.cross(b3, b1)
```

corresponding to the project code

```python
y = cross_product(z, x)
```
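With that sign fix, the second construction reproduces the first and the determinant comes out positive. A sketch based on the snippet above:

```python
import numpy as np

a1 = np.random.randn(3)
a2 = np.random.randn(3)
b1 = a1 / np.linalg.norm(a1)
b3 = np.cross(b1, a2)
b3 = b3 / np.linalg.norm(b3)
b2 = np.cross(b3, b1)                      # b3 x b1, matching y = cross_product(z, x)
R2 = np.stack((b1, b2, b3), axis=1)
print('Second det', np.linalg.det(R2))     # ~ +1.0
```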