Open tribbloid opened 3 months ago
It shouldn't be difficult to extract, but it should be tested thoroughly that it points in the right direction.
I'm not sure the fusion algorithm actually needs a magnetometer. In my experience it only makes things worse, since it is so imprecise and is heavily affected by the environment (such as the metal headrest of my chair).
Unfortunately, that's one of only two methods to avoid yaw drift (the other being a directional antenna); accelerometer/gravity readings won't detect a thing if you are rotating around the gravity axis.
But your impression is right: in my textbook, this is the first stage of a drone cascade fusion algorithm, which should be improved using a Kalman filter (KF) and ellipsoid fitting.
Remember, my CF (complementary filter) is just a placeholder to figure things out; I plan to move to https://github.com/mark2b/imu-fusion-rs ASAP, or something more advanced (ESKF/rednose etc.).
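To illustrate the yaw-drift point above: gravity gives no information about rotation around its own axis, so a CF has to pull yaw toward the magnetometer heading. A minimal sketch (illustrative only, not code from this repo; `fuse_yaw` and `alpha` are made-up names, and it assumes the mag vector has already been tilt-compensated into the horizontal plane):

```rust
// Minimal sketch: complementary-filter yaw correction from a magnetometer
// heading. Assumes mag_x/mag_y are horizontal (tilt-compensated) components.
fn fuse_yaw(yaw_gyro: f32, mag_x: f32, mag_y: f32, alpha: f32) -> f32 {
    // Heading implied by the horizontal magnetometer components.
    let yaw_mag = mag_y.atan2(mag_x);
    // Wrap the error into (-pi, pi] so the correction takes the short way.
    let mut err = yaw_mag - yaw_gyro;
    while err > std::f32::consts::PI { err -= 2.0 * std::f32::consts::PI; }
    while err < -std::f32::consts::PI { err += 2.0 * std::f32::consts::PI; }
    // Trust the gyro short-term, pull toward the mag long-term.
    yaw_gyro + alpha * err
}

fn main() {
    // Gyro-integrated yaw says 10 deg, mag heading is 12 deg; with
    // alpha = 0.02 the fused yaw is nudged slightly toward the mag.
    let h = 12f32.to_radians();
    let fused = fuse_yaw(10f32.to_radians(), h.cos(), h.sin(), 0.02);
    println!("{:.4}", fused.to_degrees()); // 10.0400
}
```

Without the `alpha * err` term, any gyro bias integrates into unbounded yaw error, which is exactly the drift being discussed.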
Do you have a draft branch with magnetometer reading, even if it points in the wrong direction?
Unfortunately I don't have any drafts, but you could add something like this to `parse_report`:
```rust
// Magnetometer fields (names suggest a timestamp offset/divisor pair
// followed by the raw axis samples):
let mag_offs = reader.read_u16::<LittleEndian>()? as f32;
let mag_div = reader.read_u32::<LittleEndian>()? as f32;
let mag_x = reader.read_i16::<LittleEndian>()? as f32;
let mag_y = reader.read_i16::<LittleEndian>()? as f32;
let mag_z = reader.read_i16::<LittleEndian>()? as f32;
```
and apply the rotation in `config_json["IMU"]["device_1"]["gyro_q_mag"]`.
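For reference, applying that rotation could look like the sketch below. This is my assumption of what `gyro_q_mag` means (a unit quaternion aligning the mag frame to the gyro frame); the `Quat` type, its `(w, x, y, z)` layout, and the rotation direction are all hypothetical, not taken from the actual config format:

```rust
// Hypothetical sketch: rotate a raw mag sample by a calibration quaternion
// (assumed to be what gyro_q_mag holds). Layout and direction are guesses.
#[derive(Clone, Copy)]
struct Quat { w: f32, x: f32, y: f32, z: f32 }

impl Quat {
    // Rotate vector v by this unit quaternion: v' = q * v * q^-1,
    // using the standard t = 2*(q_vec x v), v' = v + w*t + q_vec x t form.
    fn rotate(&self, v: [f32; 3]) -> [f32; 3] {
        let (w, x, y, z) = (self.w, self.x, self.y, self.z);
        let t = [
            2.0 * (y * v[2] - z * v[1]),
            2.0 * (z * v[0] - x * v[2]),
            2.0 * (x * v[1] - y * v[0]),
        ];
        [
            v[0] + w * t[0] + (y * t[2] - z * t[1]),
            v[1] + w * t[1] + (z * t[0] - x * t[2]),
            v[2] + w * t[2] + (x * t[1] - y * t[0]),
        ]
    }
}

fn main() {
    // Sanity check: a 90-degree rotation about Z maps +X to +Y.
    let s = std::f32::consts::FRAC_1_SQRT_2;
    let q = Quat { w: s, x: 0.0, y: 0.0, z: s };
    let mag = q.rotate([1.0, 0.0, 0.0]);
    println!("{:?}", mag); // approximately [0.0, 1.0, 0.0]
}
```

Whichever convention the config actually uses, a sanity test like the one in `main` (rotate a known axis, check where it lands) is the quickest way to catch a conjugated/inverted quaternion.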
BTW, in our proprietary solution we use a holistic EKF-based fusion (a fork of eskf-rs) which has the gyro bias as a state variable; if the glasses get rolled or pitched (i.e. the user looks up or down), the EKF can calculate the current bias very precisely, which more or less eliminates drift.
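The "bias as a state variable" idea can be shown with a toy 1-D example (this is not eskf-rs code, just the classic angle-plus-bias Kalman filter): the filter observes an absolute angle (e.g. pitch from gravity) and co-estimates the gyro bias, so once the bias is learned, integration error stops accumulating.

```rust
// Toy 1-D Kalman filter with state [angle, gyro_bias]; illustrative only.
struct BiasKf {
    angle: f32,       // estimated angle (rad)
    bias: f32,        // estimated gyro bias (rad/s)
    p: [[f32; 2]; 2], // state covariance
}

impl BiasKf {
    // Propagate with the gyro rate; F = [[1, -dt], [0, 1]].
    fn predict(&mut self, gyro: f32, dt: f32, q_angle: f32, q_bias: f32) {
        self.angle += (gyro - self.bias) * dt;
        self.p[0][0] += dt * (dt * self.p[1][1] - self.p[0][1] - self.p[1][0] + q_angle);
        self.p[0][1] -= dt * self.p[1][1];
        self.p[1][0] -= dt * self.p[1][1];
        self.p[1][1] += q_bias * dt;
    }
    // Correct with an absolute angle measurement (e.g. from the accel).
    fn update(&mut self, angle_meas: f32, r: f32) {
        let s = self.p[0][0] + r;
        let k = [self.p[0][0] / s, self.p[1][0] / s];
        let y = angle_meas - self.angle;
        self.angle += k[0] * y;
        self.bias += k[1] * y;
        let (p00, p01) = (self.p[0][0], self.p[0][1]);
        self.p[0][0] -= k[0] * p00;
        self.p[0][1] -= k[0] * p01;
        self.p[1][0] -= k[1] * p00;
        self.p[1][1] -= k[1] * p01;
    }
}

fn main() {
    let mut kf = BiasKf { angle: 0.0, bias: 0.0, p: [[1.0, 0.0], [0.0, 1.0]] };
    // Simulate a stationary device: gyro reads a pure 0.05 rad/s bias,
    // while the absolute angle measurement stays at 0.
    for _ in 0..2000 {
        kf.predict(0.05, 0.01, 0.001, 0.003);
        kf.update(0.0, 0.03);
    }
    println!("estimated bias = {:.3}", kf.bias); // converges near 0.05
}
```

The same mechanism scales up in an ESKF: the bias states are only observable when the absolute reference (gravity, or mag for yaw) constrains the corresponding axis, which is why rolling/pitching the glasses lets the filter pin the bias down.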
This may be a false alarm.
I almost finished the draft for AHRS fusion with magnetometer (still based on CF), but when running the first sanity test with `read_sensors.rs`, I realised that magnetometer readings are ignored. Interestingly, they are also missing in the Windows driver implementation:
https://github.com/MSmithDev/AirAPI_Windows/blob/fedcb2357ea6cbb9b52f8d3030fc21109c0d925b/AirAPI_Windows.cpp#L340C19-L340C24
but the Linux driver has it, and it updates at the same frequency as the other sensors:
https://gitlab.com/TheJackiMonster/nrealAirLinuxDriver/blob/3914214af0d099beeb5cb4495c8b1a93e1f11abe/interface_lib/src/device3.c#L487
I wonder if it is difficult to extract the relevant data from the frame?
Obviously, the complete fusion algorithm can only be tested after that. Alternatively, I could switch to other glasses with an existing implementation.