Xinyu-Yi / TransPose

A real-time motion capture system that estimates poses and global translations using only 6 inertial measurement units
https://xinyu-yi.github.io/TransPose/
GNU General Public License v3.0

Data synthesis for sensor relative orientations #15

Closed · VimalMollyn closed this issue 2 years ago

VimalMollyn commented 2 years ago

In your paper you mention that you synthesise global virtual "sensor" orientations. I'm not sure how I would go about synthesising sensor-relative orientations from the AMASS dataset (so that I can use this model with IMUs that only provide sensor-relative measurements). Do you guys have any insights?

Thanks! Vimal

Xinyu-Yi commented 2 years ago

Hi Vimal,

What do you mean by "relative" orientations? In our implementation, for example, the orientation of the sensor placed on the left arm is approximated by the global orientation of the left arm, which can be computed from the human pose in the AMASS dataset.
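For concreteness, here is a minimal sketch (not the repository's preprocessing code) of how a bone's global orientation can be read off an AMASS/SMPL pose by forward kinematics, i.e. by accumulating the per-joint local rotations from the root down the kinematic chain. The parent indices below are the standard 24-joint SMPL kinematic tree and the joint index for a given sensor placement is an assumption you should verify against your SMPL model; the function name and shapes are illustrative.

```python
# Minimal sketch: global bone orientation from an SMPL axis-angle pose via
# forward kinematics. Not the repo's code; verify indices for your setup.
import numpy as np
from scipy.spatial.transform import Rotation as R

# Standard SMPL kinematic tree (24 joints); -1 marks the root.
SMPL_PARENTS = [-1, 0, 0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 9, 9,
                12, 13, 14, 16, 17, 18, 19, 20, 21]

def global_bone_orientation(pose_axis_angle, joint):
    """pose_axis_angle: (24, 3) SMPL pose in axis-angle; joint: index of the bone
    the (virtual) sensor is attached to (choice depends on sensor placement)."""
    local = R.from_rotvec(pose_axis_angle).as_matrix()   # (24, 3, 3) local rotations
    glob = local[joint]
    parent = SMPL_PARENTS[joint]
    while parent != -1:                                  # accumulate up to the root
        glob = local[parent] @ glob
        parent = SMPL_PARENTS[parent]
    return glob  # approximates the global orientation of an IMU on this bone
```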

VimalMollyn commented 2 years ago

So most commodity IMUs (such as the BNO055) provide absolute orientations (angle-axis / quaternions) relative to a coordinate frame oriented according to the magnetometer and gravity axes; this is what I mean by "sensor-relative" orientations. However, the rotations computed from AMASS (as described in TransPose/DIP) are expressed relative to a fixed reference frame (referred to as the SMPL body inertial frame in both papers) and obtained by forward kinematics. How would I go about synthesising "sensor-relative" orientations from the AMASS dataset?

Xinyu-Yi commented 2 years ago

You can left-multiply the synthetic orientations by an arbitrary (but fixed) y-axis rotation matrix; this works as long as the y (gravity) axis of the SMPL body frame coincides with the y axis of the sensor's north-east-earth inertial frame. Will this solve your problem?
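As an illustration of that suggestion, here is a minimal sketch that left-multiplies every synthetic global orientation by one fixed rotation about the y (gravity) axis; this only changes the arbitrary north/east heading of the reference frame. The function name and the yaw parameter are hypothetical.

```python
# Minimal sketch: apply one fixed heading (yaw) offset about the y axis to a
# whole sequence of synthetic global orientations.
import numpy as np

def apply_heading_offset(R_global, yaw_rad):
    """R_global: (..., 3, 3) synthetic global orientation matrices.
    yaw_rad: a single fixed heading offset for the whole sequence."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R_y = np.array([[  c, 0.0,   s],
                    [0.0, 1.0, 0.0],
                    [ -s, 0.0,   c]])       # rotation about the y (gravity) axis
    return R_y @ R_global                   # same fixed R_y applied everywhere
```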

Xinyu-Yi commented 2 years ago

But if you want to synthesize the raw measurements (which contain motion noise from walking or magnetic field distortion), that would be difficult.

VimalMollyn commented 2 years ago

This makes sense, I'll try it out. Thanks for the detailed reply!

Xinyu-Yi commented 2 years ago

A small reminder: the world coordinate frame in the AMASS dataset is not y-up. You first need to left-multiply the global (root) orientation by the rotation matrix [[1, 0, 0], [0, 0, 1], [0, -1, 0]] (see preprocess.py); after that, the world-frame y axis coincides with the gravity direction. You can then compute the global orientation of each bone in this world frame by forward kinematics, and finally left-multiply any rotation about the y axis to change the north/east direction, as long as the same rotation is applied to the whole sequence.
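A minimal sketch of this frame correction, assuming the poses are stored as SMPL axis-angle parameters with the root orientation at index 0; only the fixed matrix is quoted from preprocess.py, everything else (names, shapes) is illustrative.

```python
# Minimal sketch: rotate the AMASS root orientation into a y-up world frame
# before running forward kinematics. Only AMASS_TO_YUP is from preprocess.py.
import numpy as np
from scipy.spatial.transform import Rotation as R

# AMASS world frame -> y-up world frame (gravity along -y afterwards)
AMASS_TO_YUP = np.array([[1.0, 0.0,  0.0],
                         [0.0, 0.0,  1.0],
                         [0.0, -1.0, 0.0]])

def align_root_to_yup(pose_axis_angle):
    """pose_axis_angle: (T, 24, 3) AMASS/SMPL poses; returns poses whose root
    orientation is expressed in the y-up world frame."""
    root = R.from_rotvec(pose_axis_angle[:, 0]).as_matrix()   # (T, 3, 3)
    root_yup = AMASS_TO_YUP @ root                            # left-multiply the fixed frame change
    out = pose_axis_angle.copy()
    out[:, 0] = R.from_matrix(root_yup).as_rotvec()
    return out

# After this, run forward kinematics for the per-bone global orientations and,
# if desired, left-multiply one fixed y-axis rotation to pick a north/east heading.
```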

VimalMollyn commented 2 years ago

Yup, I noticed that! Thanks!