Xinyu-Yi / TransPose

A real-time motion capture system that estimates poses and global translations using only 6 inertial measurement units
https://xinyu-yi.github.io/TransPose/
GNU General Public License v3.0

Please help me modify code to fit my sensor! #17

Closed SlimeVRX closed 2 years ago

SlimeVRX commented 2 years ago

Hi!

I'm trying to reproduce the paper's results using lower-cost IMUs.

Since I am only 1.63 m tall and have not edited the leg_length information, the estimate is not accurate. I experimented a lot to get these first results: Video Link

IMU: I'm using six Hi229 9-DOF IMUs from HiPNUC (datasheet). Frequency: 50 Hz (recorded at 100 Hz with step=2)

I recorded the data (data_align, data_T_Pose) and wrote live_hi229.py based on live_demo.py:

The acceleration read from the Noitom sensors is global, but in my data both the acceleration and the quaternion are local. I don't know how to convert the local acceleration to global, so the local acceleration is passed to the model. This could be why the body estimate is not accurate.

Can you help me revise the code (live_hi229.py) to fit my data? If you are too busy, please just read through the code once and give me some advice. Thanks a lot!

Xinyu-Yi commented 2 years ago

Hi!

I have several questions. Could you find out these things:

1) Which coordinate frame is the sensor "quaternion" expressed in? Is it the global inertial frame (shared by all IMUs)?

2) Which coordinate frame is the sensor acceleration expressed in? Is it the sensor frame (rotated together with the sensor)?

3) What is the unit of the acceleration? Is it g or m/s^2? Is the acceleration equal to g when the sensor is still (or equal to 0)?

If you can find out these things, I can help you with your code.

p.s. If you speak Chinese, you can send your wechat to me.

Xinyu-Yi commented 2 years ago

To transform the acceleration from the sensor frame (local) to the inertial frame (global), just multiply the sensor rotation matrix by the column-vector acceleration (Ra). Maybe you can try this.
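The multiplication described above can be sketched in a few lines of numpy. This is a hedged sketch, not code from the repo: the quaternion order (w, x, y, z) is an assumption and should be checked against the Hi229 output format.

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (assumed order w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def local_to_global_acc(q, a_local):
    """Rotate a sensor-frame (local) acceleration into the global frame: a_global = R @ a_local."""
    return quat_to_matrix(q) @ np.asarray(a_local, dtype=float)
```

If the sensor's quaternion uses (x, y, z, w) order instead, reorder the components before calling `quat_to_matrix`.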

SlimeVRX commented 2 years ago

Hi!

> I have several questions. Could you find out these things:
>
> 1. Which coordinate frame is the sensor "quaternion" expressed in? Is it the global inertial frame (shared by all IMUs)?
> 2. Which coordinate frame is the sensor acceleration expressed in? Is it the sensor frame (rotated together with the sensor)?
> 3. What is the unit of the acceleration? Is it g or m/s^2? Is the acceleration equal to g when the sensor is still (or equal to 0)?

I set the sensors to 9-DOF AHRS mode. I record the data of each sensor separately, without fusing the sensors together.

  1. The quaternion is local, and not shared by all IMUs.
  2. The acceleration is expressed in the sensor frame (rotated together with the sensor).
  3. The unit of acceleration is m/s^2. The raw acceleration is in mG, but I converted it to m/s^2 by multiplying by 9.81/2^10. When the sensor is still, the acceleration values are shown in the figure below.

[screenshot: acceleration readings with the sensor still]
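The unit conversion in item 3 above, plus a quick at-rest sanity check, can be sketched as follows. This is a sketch under stated assumptions: the scale of 2^10 raw counts per g is taken from the post and should be verified against the Hi229 datasheet, and the tolerance is a guess.

```python
import numpy as np

G = 9.81           # m/s^2
LSB_PER_G = 2**10  # assumed scale: 1024 raw counts per g (verify in the Hi229 datasheet)

def raw_to_ms2(raw):
    """Convert raw accelerometer counts to m/s^2 (the post's formula: raw * 9.81 / 2**10)."""
    return np.asarray(raw, dtype=float) * G / LSB_PER_G

def still_magnitude_ok(acc_ms2, tol=0.5):
    """A still accelerometer senses gravity, so |acc| should be close to g."""
    return bool(abs(np.linalg.norm(acc_ms2) - G) < tol)
```

If `still_magnitude_ok` fails on a resting sensor, the scale factor (or the unit assumption) is probably wrong.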

Xinyu-Yi commented 2 years ago

You just need to convert the acceleration to global, then check whether your code works.

You can debug like this: 1) put all 6 IMUs on the desk and keep them still; 2) run your live demo (you do not need to wear or align the IMUs) and print the calibrated measurements sent to the network. You should get identity rotation matrices and nearly-zero accelerations. Then throw a sensor into the air (if you can do this), and you will get an acceleration of magnitude g along the y axis.
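The stillness check described above can be automated with a small helper. This is a minimal sketch, not code from the repo: the shapes (6, 3, 3) and (6, 3) match the debug output posted later in this thread, and the tolerances are assumptions to tune for your sensor's noise.

```python
import numpy as np

def check_still_calibration(ori, acc, rot_tol=0.05, acc_tol=0.5):
    """Sanity-check calibrated measurements while all 6 IMUs lie still on a desk.

    ori: (6, 3, 3) calibrated rotation matrices -- each should be close to identity.
    acc: (6, 3) calibrated accelerations in m/s^2 -- each should be close to zero.
    Returns True if every IMU passes; prints a per-IMU report either way.
    """
    ori = np.asarray(ori, dtype=float)
    acc = np.asarray(acc, dtype=float)
    rot_err = np.abs(ori - np.eye(3)).max(axis=(1, 2))  # per-IMU deviation from identity
    acc_err = np.linalg.norm(acc, axis=1)               # per-IMU residual acceleration
    for i in range(len(ori)):
        ok = rot_err[i] < rot_tol and acc_err[i] < acc_tol
        print(f"IMU {i}: rot err {rot_err[i]:.3f}, |acc| {acc_err[i]:.3f} m/s^2 -> {'OK' if ok else 'BAD'}")
    return bool((rot_err < rot_tol).all() and (acc_err < acc_tol).all())
```

A `BAD` rotation usually means the alignment/calibration step is wrong; a `BAD` acceleration usually means gravity was not handled in the local-to-global conversion.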

SlimeVRX commented 2 years ago

I followed your instructions to debug:

  1. Put all 6 IMUs on the desk and kept them still. Recorded the data Stand.
  2. Ran live_debug.py. I get identity rotation matrices and nearly-zero accelerations.

Acc [6,3]: [screenshot of calibrated accelerations]

Ori [6,3,3]: [screenshots of calibrated rotation matrices]