brumster / EDTracker2

EDTracker2 Arduino sketches for Invensense MPU-xx50 based devices

Gimbal Lock? #29

Closed: GMcDowellJr closed this issue 3 years ago

GMcDowellJr commented 3 years ago

I'm experiencing what I think might be gimbal lock on my 9250-based EDTracker.

Default sketch from GitHub flashed to the board. Yaw and Pitch Scaling at 2.00 (magnetometer and gyro bias are calibrated). Opentrack using Joystick input with Yaw set to axis 2, Pitch to axis 3, and Roll set to axis 4. Freetrack 2.0 Enhanced as output, if it matters.

If I rotate EDTracker so it's sitting on its long side (which should be a 90-degree pitch), Opentrack shows about 152 degrees, and yaw and roll go to -180. I can get yaw to change by moving EDTracker as if it were rolling, but roll goes from 180 to -180 and back again. If I rotate EDTracker again so it's now sitting on its short side (opposite the USB port), I continue to get the same readings. When I turn the yaw/pitch scaling back to 1.00, my 90-degree rotation goes to 76, and yaw and roll go to about -170. This time I can get both yaw and roll to change as I move EDTracker.

Even if this isn't gimbal lock (and since the sketch uses quaternions I'd be surprised if it were), why are yaw and roll changing at all? And why isn't pitch reporting 90 or 180 degrees? The graphic of the head in the EDTracker GUI and the octopus in Opentrack are both rotating in ways I wouldn't expect.

brumster commented 3 years ago

Having pitch/yaw scaling at 2.0 will double the amount of movement, so, just roughly, pitching it forward 90 degrees would give you 180 degrees of in-game movement, assuming linear mode. It's better to test with 1:1 scaling and linear response if you're trying to replicate exact movement. Remember it's just a joystick at the end of the day, and joysticks don't have a concept of rotation. The idea is that people set scaling so that an appropriate amount of head movement left or right represents full joystick axis movement, but that amount depends entirely on the user's preferences, distance from the screen, size of the screen, etc.

But yes, at 90-degree extremes you will get strange behaviour, because the gravity vector is pulling straight down through one of the gyro axes; it will no longer be able to accurately measure rotation around itself, and over time the error on that axis will rapidly increase. This has never been a problem because, considering the use case, turning your head 90 degrees in any direction almost certainly means you can't see your screen :D ;) !!

I assume orientation is set correctly, of course; if you intend to use EDTracker in a different orientation to the default, you need to set it in the GUI. Choosing "USB right", for example, but then actually using the EDTracker in a different orientation will cause strange behaviour, as the axes will be mapped wrong.

What do you think the actual defect is? What is the expected behaviour versus the observed behaviour? A video of your screen, also showing the device being manipulated in front of it, might help me better understand the problem!

GMcDowellJr commented 3 years ago

I’ll get set up for a video.

In the meantime, I expected the head in the EDTracker GUI to look straight up, not look up and turn 180 degrees. The same goes for the octopus in Opentrack.

I've been doing a ton of reading on quaternions, and what I'm seeing is what I'd expect when using Euler angles, not quats. I see in the code that we start with quats but immediately convert to Euler, and I wonder if that has something to do with it.
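
To illustrate what I mean: a standard quat-to-Euler conversion looks like the sketch below (my own illustration with made-up names, not the actual EDTracker code), and the asinf() pitch term is exactly where the trouble at +/-90 degrees creeps in.

```c
#include <math.h>

// Illustrative quaternion -> Tait-Bryan conversion (my own sketch, not
// EDTracker's actual code). q = {w, x, y, z}, assumed unit length.
// As the asinf() argument approaches +/-1 (pitch near +/-90 degrees),
// yaw and roll stop being separately observable -- only their sum or
// difference is -- so the Euler angles jump around even though the
// quaternion itself never "locks".
void quatToEuler(const float q[4], float *yaw, float *pitch, float *roll) {
  float s = 2.0f * (q[0] * q[2] - q[3] * q[1]);
  if (s > 1.0f) s = 1.0f;    // clamp for numerical safety
  if (s < -1.0f) s = -1.0f;
  *pitch = asinf(s);
  *yaw = atan2f(2.0f * (q[0] * q[3] + q[1] * q[2]),
                1.0f - 2.0f * (q[2] * q[2] + q[3] * q[3]));
  *roll = atan2f(2.0f * (q[0] * q[1] + q[2] * q[3]),
                 1.0f - 2.0f * (q[1] * q[1] + q[2] * q[2]));
}
```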

JoyShockMapper is another repo on GitHub that uses the gyro and accelerometer from controllers like the DS4 (which is what I'm ultimately trying to emulate) to convert movement into joystick inputs (aiming by moving the controller rather than with the sticks). I'm far from understanding it all, but my sense is there's something in it that could help with this.

GMcDowellJr commented 3 years ago

The repo: https://github.com/JibbSmart/JoyShockLibrary

GMcDowellJr commented 3 years ago

Sorry for the delay. Here's the video.

https://share.icloud.com/photos/0H5V3_YVE8qVVMm4iKikgq4dw

You can see that when I pitch up, the octopus in Opentrack flips and rolls, but when I roll or yaw the octopus behaves as I'd expect.

I've been doing a fair amount of research on Madgwick and Mahony filters, but I haven't made any real progress because I keep running out of room to flash the code. It's right at 96% before I even start! That doesn't leave much room. I can get close if I comment out all the code that interfaces with the GUI app, but even then I'm still 4-6% over the limit. The next step will be to try an external programmer to remove the overhead of the bootloader. I'm not sure it will be enough, however.

You said that the reason for what I'm seeing with pitch is that gravity is pulling straight down through one of the sensor axes, yes? If that's the case, why doesn't a roll do the same thing? Something to do with the position of the accelerometer, maybe?

I'm probably barking up the wrong tree with all of this. If you recall, I'm trying to get the EDTracker to work on a PS4, where we can use the DS4 controller for headlook. If I pull the gyro readings from the 9250 and push them into the PS4, headlook works in ED, but if I move too quickly in a direction, or pitch too high, or look 90 degrees to the side, I get stuck: yaw is no longer yaw and pitch is no longer pitch. I have to roll my head, for example, to pitch up when I'm looking to the left or right in order to have any chance of getting back to center. This loss of control is what puts me in mind of gimbal lock.

Right now I'm assuming that either ED or the PS4 is taking the gyro and accelerometer data from the DS4 and using some form of sensor fusion to correct the gyro/accel values (hence my attempts at the Madgwick and Mahony filters). Just need more room! lol

brumster commented 3 years ago

Here is a video of normal behaviour

https://www.youtube.com/watch?v=4CzG9UzcA8c

Firstly, in your video you're showing Opentrack, so there's a lot of potential for it to be doing something in the middle that I can't vouch for. I can only show you the response in the EDTracker GUI (or the joystick control panel), which is the limit of EDTracker's responsibility; if your device's behaviour in there is consistent with my video, it's working as intended. Keep it away from anything metallic, just in case it's affecting the mag readings (metal laptop?).

Memory-wise, you will struggle; we battled to get EDTracker into the space available. There is the HalfKay bootloader out there you could look into if you really want to keep one.

I must admit I'm struggling to understand your use case; I don't understand how turning your head 90 degrees in any direction is practical...! But anyway, formally I would say this: EDTracker was designed for head look. It works fine for rotating your head through 360 degrees (it wraps the axes), and tilt/pitch will be fine and accurate up to a point, but as you approach 90 degrees on either pitch or tilt, the accuracy of the MEMS sensors reduces and you'll start to get odd behaviour - but then, how you're seeing the screen at such extreme pitch angles I don't understand :) !!

The DMP in the Invensense 9250 can do sensor fusion, yes, or you can read raw accel/gyro values. Raw sensor values are of very little use in gameplay, as they would need heavy filtering; the accel isn't really of much value for head-look use cases - it's mainly the gyro and the mag you're interested in for steady positioning.

EDTracker is pretty simple - straight ahead is 0,0,0 on the X,Y,Z axes of your joystick, and spinning 180 degrees around the vertical axis is full axis limits on the joystick. The axes wrap. I am assuming linear response mode and 1:1 mapping; obviously exponential and/or a non-1:1 mapping will throw that off. If you want 90 degrees of real-world head motion to be 90 degrees of virtual joystick movement (ie. 50% on an axis), you need to run scaling at 1.0 and linear response mode.
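
As an illustration of that mapping (the idea, not the literal sketch code, and assuming a signed 16-bit joystick axis):

```c
#include <stdint.h>

// Illustration only, not EDTracker's actual code: linear response,
// 0 degrees = centre, +/-180 degrees = full axis deflection, wrapping
// so a continuous spin rolls the axis over.
int16_t yawToAxis(float yawDeg, float scaling) {
  float scaled = yawDeg * scaling;
  while (scaled > 180.0f) scaled -= 360.0f;     // wrap into (-180, 180]
  while (scaled <= -180.0f) scaled += 360.0f;
  return (int16_t)(scaled / 180.0f * 32767.0f); // map onto the axis range
}
```

So with scaling at 2.0, a real 90-degree turn already reaches the axis limit - the doubled movement I described earlier.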

Obviously an uncalibrated mag or uncorrected gyro/accel biases will throw the behaviour all over the shop - I am assuming the device is properly calibrated :D !! Stating the obvious, I know, but...

Hope some of that helps!

GMcDowellJr commented 3 years ago

Yes, that's what I get in the EDTracker GUI as well. And I completely understand your point about not being able to see the screen. Because I'm pulling the gyro values directly, none of the yaw or pitch scaling is being applied. Right now I have to turn nearly 90 degrees to get headlook to 90 degrees. I know I'll have to address that before I'm done, but my sense is that's a problem for later... though perhaps if I solved it now, it would fix the issue I'm having.

I calibrated the mag with 1,999 points and the spheres look good based on your videos and what I've seen online. I've also done the gyro bias calibration.

Speaking of gyro calibration: the MPU-9250 has automatic calibration built into the DMP, so I'm wondering why EDTracker does it itself instead of letting the DMP take care of it. I'd like to see the difference between what EDTracker does and what the DMP does. Do you know what parts of the code manage that?

Since the DS4 has both a gyro and an accelerometer, I'm wondering if the motion-based headlook that the PS4 has in ED depends on both. Perhaps ED is doing sensor fusion based on those inputs using some form of complementary filter. Right now I'm just offloading the gyro values. I've tried pulling the accel values as well, but dealing with HID descriptors is like deciphering ancient Greek right now.
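
For what it's worth, the kind of complementary filter I have in mind is simple enough to sketch. This is just my illustration with made-up names, assuming a gyro rate in deg/s and accel in g - not anything ED or Sony is confirmed to do:

```c
#include <math.h>

// Minimal complementary filter for pitch -- purely a sketch of the idea
// I'm speculating about. gyroRateDegS: pitch rate from the gyro in deg/s;
// ax/ay/az: accel in g; dt: time step in seconds. Names are made up.
float fusePitch(float prevPitchDeg, float gyroRateDegS,
                float ax, float ay, float az, float dt) {
  const float alpha = 0.98f;  // trust the gyro short-term, accel long-term
  // Fast path: integrate the gyro (responsive, but drifts over time).
  float gyroPitch = prevPitchDeg + gyroRateDegS * dt;
  // Slow path: derive pitch from gravity's direction (noisy, but drift-free).
  float accelPitch = atan2f(-ax, sqrtf(ay * ay + az * az)) * 57.29578f;
  // Blend the two: the accel term continuously bleeds the gyro drift away.
  return alpha * gyroPitch + (1.0f - alpha) * accelPitch;
}
```

Interestingly, pitch from the accelerometer stays well defined at +/-90 degrees, but the equivalent roll term (atan2f(ay, az)) degenerates there because ay and az both go to zero - which might be related to what I'm seeing.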

Thanks for your continued help!

brumster commented 3 years ago

By calibration, I assume you're talking about the bias values? The DMP is a little bit of a "black box" on Invensense's part: yes, it does the sensor fusion for you, but how it's configured is somewhat hidden in the motion driver that Invensense provide - there's a block of firmware code you have to upload into it to make use of the motion driver. This is the large block of hex code you can see in inv_mpu_dmp_motion_driver.c.

It was a long time ago and Rob did a lot of that portion. I know we tried a number of approaches in the very early days, switching between using the DMP and the features within the motion driver versus doing the heavy lifting ourselves, to check performance and lag. As you can see from the code, we use:

```c
unsigned short dmp_features = DMP_FEATURE_6X_LP_QUAT | DMP_FEATURE_SEND_RAW_ACCEL | DMP_FEATURE_SEND_RAW_GYRO; //| DMP_FEATURE_GYRO_CAL;
```

The commenting suggests to me that we once tried the gyro self-calibration feature but chose to disable it for some good reason; I suspect in periods of no motion it would kick in unexpectedly and impact performance. We use the fused data (ie. gyro and accel), but there is a feature to use just the gyro (DMP_FEATURE_LP_QUAT) if you want to try it. We get the raw sensor values only for debugging/display in the UI and for bias calculation; if you look in the code you'll see we don't use them for anything else.

It might be worth pulling up the Invensense Embedded Motion Driver API documentation and tutorial; they might explain a bit more about how the DMP can be configured and how it functions.
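
If it helps while you're digging: from memory, the typical eMPL bring-up sequence looks roughly like the below. Treat the exact calls as approximate and check them against the inv_mpu.h and inv_mpu_dmp_motion_driver.h headers.

```c
// Rough eMPL-style DMP bring-up, from memory -- verify the signatures
// against the motion driver headers before relying on it.
unsigned short dmp_features = DMP_FEATURE_6X_LP_QUAT |
                              DMP_FEATURE_SEND_RAW_ACCEL |
                              DMP_FEATURE_SEND_RAW_GYRO;

dmp_load_motion_driver_firmware();  // uploads that big block of hex to the DMP
dmp_enable_feature(dmp_features);   // choose fusion mode + raw outputs
dmp_set_fifo_rate(100);             // DMP output rate, Hz
mpu_set_dmp_state(1);               // route sampling through the DMP

// Then, each loop, drain the FIFO; 'sensors' flags which fields are valid:
short gyro[3], accel[3], sensors;
long quat[4];
unsigned long timestamp;
unsigned char more;
dmp_read_fifo(gyro, accel, quat, &timestamp, &sensors, &more);
```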

You say you want to see the difference between what EDTracker does and what the DMP does - I don't understand what you mean? EDTracker uses the DMP?

As for the last paragraph, you're on your own there, I'm afraid! My guess would be that Sony's hand controllers have a standard hardware API and that, similar to the 9250, they do as much of the heavy lifting of sensor fusion and calculation inside the controller as possible, to relieve the PS4 of gruntwork. There is then presumably a standard API in the PS4's software for accessing those values, so game developers don't have to do too much worrying about working stuff out. You've essentially got to reverse engineer, at the USB level, what those hand controllers pass to the operating system. Without knowing the API spec you'll have to try and figure it out. If it is just raw sensor values, then that does make most of the EDTracker code superfluous, as they've basically implemented all the heavy lifting inside the PS4.

Ah, HID descriptors - another whole new thing to learn :) USB.org's HID descriptor tool is quite useful.

GMcDowellJr commented 3 years ago

Re the DMP question: the 9250 documentation says this:

```
4.1.2.3 int dmp_enable_gyro_cal(unsigned char enable)
Calibrate the gyro data in the DMP. After eight seconds of no motion, the DMP
will compute gyro biases and subtract them from the quaternion output. If
dmp_enable_feature is called with DMP_FEATURE_SEND_CAL_GYRO, the biases will
also be subtracted from the gyro output.
Parameters: enable - 1 to enable gyro calibration.
Returns: 0 if successful.
```

EDTracker does this without this setting enabled on the DMP, yes? In order to save space I’m wondering if I can let the DMP do the work instead.

brumster commented 3 years ago

Correct - it's commented out, so it doesn't take effect. We calculate bias offsets manually and push them up to the MPU. The gyro is done on every startup (ie. after being plugged in) and whenever a "full calibration" is requested via the UI.

The best I can suggest is to give it a try! There was probably a good reason why we did it that way, but who knows - if you're struggling for space, try the DMP feature instead and remove all our gyro biasing code in the startup phases.
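
Roughly, and going purely off the doc you quoted (so untested here), the swap would be something like:

```c
// Let the DMP self-calibrate the gyro after ~8s of no motion, and have
// the biases subtracted from the gyro output too. SEND_CAL_GYRO replaces
// SEND_RAW_GYRO, as the two are alternatives. Untested -- per the doc.
unsigned short dmp_features = DMP_FEATURE_6X_LP_QUAT |
                              DMP_FEATURE_SEND_RAW_ACCEL |
                              DMP_FEATURE_SEND_CAL_GYRO |
                              DMP_FEATURE_GYRO_CAL;
dmp_enable_feature(dmp_features);
dmp_enable_gyro_cal(1);  // 1 = enable; returns 0 on success
```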

GMcDowellJr commented 3 years ago

I just found it and did some testing. That 8 seconds the DMP takes to correct the bias is a long time compared to what EDTracker does without it. Seems pretty obvious now why y'all did it that way! lol

Thanks for your help. The last thing I'm going to try with all of this is the USB HID stuff. If that doesn't do it for me then I'll accept defeat... one more reason to switch from a console to a PC!

brumster commented 3 years ago

I'd hold fire right now; remotely decent graphics cards are about the price of a console anyway :( !!