
An Analysis of SVD for Deep Rotation Estimation #61

Open j20232 opened 4 years ago

j20232 commented 4 years ago

Summary

A continuous, unconstrained 9D representation followed by an SVD projection onto SO(3) works well as a 3D rotation representation in neural networks.
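
As a reference for the summary above, here is a minimal sketch of the SVD projection step, assuming a PyTorch setting; the function name and tensor shapes are illustrative, not the authors' released code.

```python
import torch


def symmetric_orthogonalization(m: torch.Tensor) -> torch.Tensor:
    """Project unconstrained 9D network outputs onto SO(3) via SVD.

    m: tensor of shape (batch, 9) or (batch, 3, 3), the raw network output.
    Returns rotation matrices of shape (batch, 3, 3).
    """
    m = m.reshape(-1, 3, 3)
    u, s, vT = torch.linalg.svd(m)
    # Replace the singular values with (1, 1, det(U V^T)) so the product
    # has determinant +1, i.e. lies in SO(3) rather than just O(3).
    det = torch.det(torch.bmm(u, vT))
    d = torch.ones_like(s)
    d[:, -1] = det
    return torch.bmm(u * d.unsqueeze(1), vT)
```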

Abstract

Symmetric orthogonalization via SVD, and closely related procedures, are well-known techniques for projecting matrices onto O(n) or SO(n). These tools have long been used for applications in computer vision, for example optimal 3D alignment problems solved by orthogonal Procrustes, rotation averaging, or Essential matrix decomposition. Despite its utility in different settings, SVD orthogonalization as a procedure for producing rotation matrices is typically overlooked in deep learning models, where the preferences tend toward classic representations like unit quaternions, Euler angles, and axis-angle, or more recently-introduced methods. Despite the importance of 3D rotations in computer vision and robotics, a single universally effective representation is still missing. Here, we explore the viability of SVD orthogonalization for 3D rotations in neural networks. We present a theoretical analysis that shows SVD is the natural choice for projecting onto the rotation group. Our extensive quantitative analysis shows simply replacing existing representations with the SVD orthogonalization procedure obtains state of the art performance in many deep learning applications covering both supervised and unsupervised training.
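
As a usage illustration of "simply replacing existing representations", a pose-regression head could emit 9 unconstrained numbers per example and pass them through the projection sketched above; `RotationHead` and `feat_dim` are hypothetical names, not from the paper, and `symmetric_orthogonalization` refers to the earlier sketch.

```python
import torch
import torch.nn as nn


class RotationHead(nn.Module):
    """Hypothetical regression head: backbone features -> 9D -> SO(3)."""

    def __init__(self, feat_dim: int = 512):
        super().__init__()
        self.fc = nn.Linear(feat_dim, 9)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Unconstrained 9D output, then SVD projection (see sketch above).
        return symmetric_orthogonalization(self.fc(features))
```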

Author

Journal/Conference

arXiv preprint 2020

Subjects

Comment

Link