RomanBapst closed this 7 years ago
@LorenzMeier @eyeam3 I have investigated a bit why using the delta time reported by the driver framework does not work. The driver framework assumes a fixed sampling period for the sensor, e.g. 1 ms, and then splits that time among the samples measured during that period.
The following observation was made while collecting sensor data at 1 kHz and also reading the sensor out at 1 kHz:
1) The number of samples in the FIFO buffer varied a lot, between 1 and 3. Maybe this could be an issue, because the number of packets per cycle is low-pass filtered here https://github.com/PX4/DriverFramework/blob/master/drivers/mpu6050/MPU6050.cpp#L445 and is used here https://github.com/PX4/DriverFramework/blob/master/drivers/mpu6050/MPU6050.cpp#L551 to compute the delta time for each sample in the FIFO buffer.
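For reference, the scheme I am describing looks roughly like the sketch below. The names and constants are mine and not the actual ones in MPU6050.cpp; it is only meant to illustrate how a fixed read-cycle period gets split among a low-pass filtered packet count.

```cpp
// Illustrative sketch of the delta-time scheme described above, NOT the
// actual MPU6050.cpp code. Names and constants are made up for clarity.

#include <cstddef>

static constexpr float kMeasureIntervalUs = 1000.0f; // assumed fixed 1 kHz read cycle
static float packets_per_cycle_filtered = 1.0f;      // low-pass filtered FIFO packet count

// Called once per read cycle with the number of packets found in the FIFO.
void update_packet_filter(size_t packets_in_fifo)
{
	// Simple first-order low-pass, analogous to what happens around
	// MPU6050.cpp#L445: the instantaneous count (1..3 at 1 kHz) is smoothed.
	const float alpha = 0.1f;
	packets_per_cycle_filtered =
		(1.0f - alpha) * packets_per_cycle_filtered + alpha * (float)packets_in_fifo;
}

// Per-sample delta time assigned to each packet in the FIFO, analogous to
// MPU6050.cpp#L551: the fixed cycle time is split among the filtered count.
float fifo_sample_delta_us()
{
	return kMeasureIntervalUs / packets_per_cycle_filtered;
}
```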
The following observation was made when I left the sensor sample rate at 1 kHz but decreased the IMU data collection rate to 500 Hz (setting this value to 2000: https://github.com/PX4/DriverFramework/blob/master/drivers/mpu6050/MPU6050.hpp#L45 ; a sketch of that change follows after the list below):
1) The number of samples in the FIFO buffer was very stable at around 2, which makes sense because we read at half the sensor rate. As a result, the filtered number of IMU packets per cycle was of course also much more constant.
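Concretely, the change I tested amounts to something like the following. The constant name here is an assumption on my part; the real define is the one linked above at MPU6050.hpp#L45.

```cpp
// Hypothetical name, standing in for the define at MPU6050.hpp#L45.
// 1000 us -> 2000 us: drain the FIFO at 500 Hz while the sensor still
// samples at 1 kHz, so roughly two packets accumulate per read cycle.
#define MPU6050_MEASURE_INTERVAL_US 2000
```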
I have tested the effect on the Bebop and the second strategy resolves the issue we have seen with the 360 degree turns in heading. However, I'm not yet convinced that computing the delta time from an assumed constant sampling interval makes more sense than computing it dynamically (using the high-resolution timer). Can we really assume a constant sampling interval on platforms which are not running an RTOS? Thoughts?
I think the solution is somewhere in between. I see an issue when using the timestamp-based delta in combination with buffered sensor data. For example, if the buffer contains two measurements, they are read together and publish is called twice in quick succession. As a result, we have one long delta and one short delta between the timestamps in the wrapper (a sketch of this effect follows below). On the other hand, the sensor should sample its measurements with a constant period and thus produce a constant delta between published packets. However, in the beginning I had a similar issue to #132, which is why I also added those lines of code you mentioned above. During the first tests, I dropped the sampling frequency from 8 kHz to 1 kHz with the sample rate divider. This continues with the introduction of the DLPF in #163, which could probably resolve the effect that the sensor sampling is not accurate enough.
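To make the buffered-read problem concrete, here is a rough sketch (made-up names, not the actual wrapper code) of why two packets drained from the FIFO in one cycle produce one long and one short timestamp delta:

```cpp
// Sketch of the effect described above, with made-up names.
// Assume the sensor samples every 1000 us, but the FIFO is drained every 2000 us.

#include <cstdint>
#include <cstdio>

uint64_t now_us = 0; // simulated clock, stand-in for a high-resolution timer

int main()
{
	uint64_t last_publish = 0;

	for (int cycle = 0; cycle < 3; cycle++) {
		now_us += 2000;                 // one read cycle: 2 packets sit in the FIFO

		for (int packet = 0; packet < 2; packet++) {
			// Both packets are published back-to-back within the same cycle,
			// so the wrapper sees timestamps that are only microseconds apart.
			uint64_t stamp = now_us + packet;
			printf("delta = %llu us\n", (unsigned long long)(stamp - last_publish));
			last_publish = stamp;
		}
		// Output pattern: ~2000 us, ~1 us, ~2000 us, ~1 us, ... instead of a
		// constant 1000 us, even though the sensor itself sampled evenly.
	}
	return 0;
}
```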
@eyeam3 What's the latest state here?
The fix should be merged as it corrects an incorrect calculation.
At the moment, the wrapper in the firmware uses hrt_absolute_time() to calculate the sample interval and not this field value. So far, it seems to work stably. I would conclude this topic for now and, if necessary, look into it again in the future.
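For reference, the hrt-based interval computation is essentially of the following shape. This is a simplified sketch with a made-up callback name, not the actual wrapper code in the firmware.

```cpp
// Simplified sketch of an hrt-based sample interval, not the actual wrapper.
// hrt_absolute_time() is the PX4 high-resolution timer (microseconds).

#include <drivers/drv_hrt.h>

static hrt_abstime last_sample_time = 0;

void on_sensor_data(/* sensor payload */)
{
	const hrt_abstime now = hrt_absolute_time();

	// Delta time measured between consecutive publications instead of the
	// fixed-period value reported by the driver framework.
	const float dt_s = (last_sample_time != 0) ? (now - last_sample_time) * 1e-6f : 0.f;
	last_sample_time = now;

	// dt_s would then be passed on with the published sample.
	(void)dt_s;
}
```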
…ling
Signed-off-by: Roman <bapstroman@gmail.com>