dlaidig / vqf


Coordinate System Description #1

Closed: wqiu-calgary closed this issue 2 years ago

wqiu-calgary commented 2 years ago

Hi, thanks for sharing the code. Are there any documents describing the coordinate systems VQF uses in the implementation? For example, what is the coordinate system for the sensor body frame, and what rotation does the output quaternion represent?

Thanks in advance.

dlaidig commented 2 years ago

Hi,

Thanks for your interest in the code!

If you look at Figure 2 of the preprint (https://arxiv.org/pdf/2203.17024.pdf), you can find an illustration of the different coordinate systems used by VQF. The 9D reference frame uses the ENU convention, i.e., the x-axis points east, the y-axis points north, and the z-axis points up. The sensor frame is only defined by the IMU model, since this is the frame in which the raw data is provided.

As written in Section 2.1, the output quaternion rotates vectors from the sensor frame to the earth frame when multiplying with q on the left and q^-1 on the right. This is the convention used by most orientation estimation algorithms that I have seen, but some algorithms might report the inverse quaternion instead. It is also important to know that VQF outputs quaternions as [w x y z], while some other software uses [x y z w].

So if the output quaternions do not fit your expectations, the pragmatic approach would be to

  1. check whether the quaternion needs to be inverted, and
  2. check whether the w-component should be last instead of first.
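
For illustration, here is a minimal, self-contained sketch of both checks. It is not part of VQF; the helper names (rotate, inverse, wxyzToXyzw) are made up for this example:

```cpp
#include <array>

using Vec3 = std::array<double, 3>;
using Quat = std::array<double, 4>;  // [w x y z], the order VQF outputs

static Vec3 cross(const Vec3& a, const Vec3& b)
{
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

// Rotate a vector from the sensor frame to the earth frame, v_E = q v_S q^-1,
// using the identity v' = v + 2w(u x v) + 2(u x (u x v)) with u = [x y z].
Vec3 rotate(const Quat& q, const Vec3& v)
{
    const Vec3 u = {q[1], q[2], q[3]};
    const Vec3 t = cross(u, v);
    const Vec3 t2 = cross(u, t);
    return {v[0] + 2.0 * (q[0] * t[0] + t2[0]),
            v[1] + 2.0 * (q[0] * t[1] + t2[1]),
            v[2] + 2.0 * (q[0] * t[2] + t2[2])};
}

// Check 1: for unit quaternions, the inverse is the conjugate and rotates
// vectors the other way, i.e., from the earth frame to the sensor frame.
Quat inverse(const Quat& q) { return {q[0], -q[1], -q[2], -q[3]}; }

// Check 2: reorder [w x y z] to [x y z w] for software that expects w last.
Quat wxyzToXyzw(const Quat& q) { return {q[1], q[2], q[3], q[0]}; }
```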

I hope this helps.

wqiu-calgary commented 2 years ago

Thanks for the explanation. I am still not clear on the definition of the body (sensor) frame.

In your test code, the sensor frame is defined such that the vertical acceleration is on the 2nd axis, {0, 9.8, 0}. If the sensor data is collected by an Android phone (whose sensor frame is defined as: x pointing to the right, y pointing forward, z pointing up), do I need to convert the frame so that it follows the test code's sensor frame convention, or can the data, e.g. {0, 0, 9.8}, be input to the filter directly?

```cpp
void run(int mode) {
    vqf_real_t gyr[3] = {0.01, 0.01, 0.01};
    vqf_real_t acc[3] = {0, 9.8, 0};
    // ...
}
```

dlaidig commented 2 years ago

I am still unsure what you mean when you talk about "defining" the sensor frame, so I will try to explain it based on this example:

In my test code, I generate very simple (and unrealistic) dummy measurements that say "the accelerometer always measures [0 9.8 0]", which means that the y-axis of the IMU is pointing upwards. If the sensor is an Android phone, that means that someone is holding the phone up. If the accelerometer measurement were [0 0 9.8], that would mean that the z-axis of the phone points up, i.e., the phone is lying flat on a table (with the display visible).

There is nothing you need to do with the coordinates of the IMU data before feeding it into the orientation estimation algorithm. The result will then be an orientation quaternion that describes the rotation between the ENU earth frame and the sensor frame (as you described it, if it is an Android phone).
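
As a minimal sketch (following the basic usage pattern from the README, assuming the C++ API with VQF(gyrTs), update(gyr, acc), and getQuat6D(out)), the Android-frame data would simply be passed in as-is:

```cpp
#include "vqf.hpp"

int main()
{
    VQF vqf(0.01);  // gyroscope sampling time in seconds (here: 100 Hz)

    // Raw data straight from the IMU, in its own sensor frame. For an Android
    // phone lying flat on a table (z-axis up), the accelerometer reads ~[0 0 9.8].
    vqf_real_t gyr[3] = {0.0, 0.0, 0.0};   // rad/s
    vqf_real_t acc[3] = {0.0, 0.0, 9.81};  // m/s^2

    vqf.update(gyr, acc);  // no frame conversion needed beforehand

    vqf_real_t quat[4];    // [w x y z], rotates sensor frame to earth frame
    vqf.getQuat6D(quat);
    return 0;
}
```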

wqiu-calgary commented 2 years ago

I see. Thank you.

aptperson commented 1 year ago

Hi, firstly, thanks for sharing your code. As a disclaimer, I am new to IMU orientation estimation.

The IMU that I am using is in "ENU" (x left, y forward, z up), and if the IMU is flat on a table I get an accelerometer vector of [0 0 -9.8]; if it is facing upwards, I get [0 -9.8 0]. In your examples above, gravity is always positive.

In light of gravity being positive in your examples, should I invert my IMU accelerations or convert the IMU data to NED before inputting it to VQF?

Thanks in advance.

dlaidig commented 1 year ago

There are a few things in your question that might be the source of the confusion.

  1. Accelerometers measure +9.8 m/s^2 if the respective axis is pointing up and -9.8 m/s^2 if the axis is pointing down, since at rest they measure the specific force that counteracts gravity, not gravity itself. The expectation that the value is negative when the axis points up is a common misconception, see e.g. https://electronics.stackexchange.com/a/450357. In my examples, it was just a coincidence that the value was positive, since the sign depends on the coordinate system of the IMU and on its orientation.

  2. I am not sure what you mean when you say that your IMU is "ENU". ENU/NED are common conventions for defining the reference frame for orientation estimates. For defining the local sensor coordinate system in which the raw data is provided, "east" and "north" do not make sense to me. I can hold the IMU so that any axis points north...

  3. Also, the description "x left, y forward, z up" is a little bit ambiguous without a sketch. However, the only important thing is to ensure that the local coordinate system is a right-handed coordinate system (such as the coordinate system that Android uses, https://developer.android.com/guide/topics/sensors/sensors_overview#sensors-coords). In the unlikely case that your IMU reports raw data in a left-handed coordinate system, that would be the only reason to multiply the raw data of one of the axes by -1.
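
In that unlikely case, here is a minimal sketch of the fix from point 3 (assuming, purely for illustration, that the z-axis is the mirrored one; the same flip must be applied consistently to all sensors):

```cpp
#include "vqf.hpp"  // for vqf_real_t

// Hypothetical fix-up for an IMU that reports raw data in a left-handed frame:
// negating exactly one axis (assumed here to be z) makes the frame right-handed.
void makeRightHanded(vqf_real_t gyr[3], vqf_real_t acc[3], vqf_real_t mag[3])
{
    gyr[2] = -gyr[2];
    acc[2] = -acc[2];
    mag[2] = -mag[2];
}
```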