VirtCode / SmartMouse

use your smartphone as a normal computer mouse
GNU General Public License v3.0

Cancel effects of rotation on accelerometer #14

Open LinuxinaBit opened 6 months ago

LinuxinaBit commented 6 months ago

When rotating the mouse away from flat, it seems to believe you are moving it in the direction that is now pointing down, which indicates that the gyroscope is not being properly used to cancel out this motion.

LinuxinaBit commented 6 months ago

Could this be rectified by using Sensor.TYPE_LINEAR_ACCELERATION (https://developer.android.com/reference/android/hardware/SensorEvent.html#sensor.type_linear_acceleration:)? Or are you already doing this?
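For reference, subscribing to that sensor only takes a few lines on Android. The sketch below is generic: the class name and what happens with the values are placeholders, not code from SmartMouse.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Minimal sketch: listening to TYPE_LINEAR_ACCELERATION.
// The system removes gravity before delivering the values.
class LinearAccelerationListener(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val linearAccel =
        sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION)

    fun start() {
        // request the highest rate the sensor offers
        sensorManager.registerListener(this, linearAccel, SensorManager.SENSOR_DELAY_FASTEST)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent?) {
        val values = event?.values ?: return
        // values[0..2] are acceleration in m/s^2 with gravity already subtracted;
        // they would be fed into whatever turns acceleration into cursor movement
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}
```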

VirtCode commented 6 months ago

> When rotating the mouse away from flat, it seems to believe you are moving it in the direction that is now pointing down, which indicates that the gyroscope is not being properly used to cancel out this motion.

You are correct. The gyroscope is not currently used to correct the gravitation vector used internally, which is what causes the behavior you described. I also think that correcting for this rotation would fix many of the artifacts seen when using the phone on a surface. I implemented a very naive approach some time ago, which turned out to be very wrong, so that's what I'm currently looking into.

> Could this be rectified by using Sensor.TYPE_LINEAR_ACCELERATION?

As far as I am aware, this sensor is pretty much the measured acceleration passed through a low-pass filter to remove the gravity offset. In my testing, this proved to be unusable for this application, which is why this app takes a different approach to gravity compensation. Additionally, this sensor provides a lower sample rate than the "unprocessed" sensors, which further limits accuracy. So sadly, this sensor is not really an option.
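For context, the low-pass style of gravity estimation described above is usually along these lines. The filter constant and function name are illustrative; this is not the app's actual compensation code.

```kotlin
// Rough sketch of low-pass gravity estimation: the gravity estimate trails the
// raw accelerometer reading, and subtracting it yields a "linear acceleration"
// similar to what TYPE_LINEAR_ACCELERATION reports. ALPHA is an assumed constant.
private val gravity = FloatArray(3)
private const val ALPHA = 0.8f

fun removeGravity(raw: FloatArray): FloatArray {
    val linear = FloatArray(3)
    for (i in 0..2) {
        // gravity follows the raw reading slowly (low-pass); the remainder is motion
        gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * raw[i]
        linear[i] = raw[i] - gravity[i]
    }
    return linear
}
```

The weakness of this kind of filter is that the gravity estimate lags behind quick orientation changes, which is exactly the situation this issue is about.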

VirtCode commented 4 months ago

I've spent a couple of days in the last two weeks trying to get something working and have done a ton of testing. Sadly, I have hit a bit of a roadblock, which seems to make this unfeasible.

I'm essentially trying to rotate the gravity vector with the data of the gyroscope. The gravity vector is captured when the phone is not moving, and should then be corrected/rotated with the gyroscope during the active periods to get a more accurate gravity estimate. However, this gravity estimate ends up being far from accurate. The error is so large that I think it cannot be caused by drift or some other slight inaccuracy (which would be expected when using inertial sensors). I think that I might have missed something crucial, and am somewhat out of ideas.
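To make the idea a bit more concrete, one integration step of such a correction could look roughly like the sketch below (a small-angle rotation of the stored gravity vector using Rodrigues' formula). All names and conventions here are illustrative assumptions, not the project's actual code.

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Sketch: rotate a previously captured gravity vector by the rotation the
// gyroscope reports over one sample interval.
// gravity: current gravity estimate (m/s^2, device frame)
// gyro:    angular rate (rad/s, device frame)
// dt:      time since the last sample (s)
fun rotateGravity(gravity: FloatArray, gyro: FloatArray, dt: Float): FloatArray {
    val angle = sqrt(gyro[0] * gyro[0] + gyro[1] * gyro[1] + gyro[2] * gyro[2]) * dt
    if (angle < 1e-9f) return gravity

    // normalized rotation axis
    val ax = gyro[0] * dt / angle
    val ay = gyro[1] * dt / angle
    val az = gyro[2] * dt / angle

    // a world-fixed vector rotates opposite to the device in device coordinates
    val c = cos(-angle)
    val s = sin(-angle)

    // Rodrigues: v' = v*c + (k x v)*s + k*(k . v)*(1 - c)
    val dot = ax * gravity[0] + ay * gravity[1] + az * gravity[2]
    val crossX = ay * gravity[2] - az * gravity[1]
    val crossY = az * gravity[0] - ax * gravity[2]
    val crossZ = ax * gravity[1] - ay * gravity[0]

    return floatArrayOf(
        gravity[0] * c + crossX * s + ax * dot * (1 - c),
        gravity[1] * c + crossY * s + ay * dot * (1 - c),
        gravity[2] * c + crossZ * s + az * dot * (1 - c)
    )
}
```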

So if anyone with further knowledge on this topic happens to see this: I've created a minimal example that reproduces the errors. It also further explains the testing I've done and the errors I am talking about. Don't hesitate to share your thoughts and ideas.

I'll probably work on a touchpad-like mode next, to make the app actually usable.

LinuxinaBit commented 4 months ago

Have you looked into how those air mice and LG TV remotes work? They appear to just use the gyroscope, which is interesting... Maybe they decided accelerometers are too unreliable for positioning? It is unfortunate because you can't really use an air mouse on a desk, but trackpads work just fine for that.

If you decide to implement both an air mouse and a trackpad (and maybe also keyboard functionality), this app would likely replace Bluetooth peripherals in many more situations, with higher reliability to boot.
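For comparison, the gyro-only scheme those devices appear to use can be sketched in a few lines: angular rate is scaled directly into cursor deltas, with no accelerometer involved. The axis mapping and sensitivity constant below are assumptions for illustration.

```kotlin
// Sketch of an "air mouse" style mapping: rotation rate around two device axes
// becomes cursor movement. SENSITIVITY is an assumed tuning constant.
const val SENSITIVITY = 400f

fun gyroToCursorDelta(gyro: FloatArray, dt: Float): Pair<Float, Float> {
    // assumed mapping: yaw (around the device z axis) moves the cursor
    // horizontally, pitch (around the x axis) moves it vertically
    val dx = -gyro[2] * dt * SENSITIVITY
    val dy = -gyro[0] * dt * SENSITIVITY
    return dx to dy
}
```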

VirtCode commented 4 months ago

Yes, air mice only have to rely on the orientation of the device. I've already seen tons of research about this type of sensor fusion for orientation estimation, most of which uses the gyroscope and corrects the data with the accelerometer and the magnetometer. What I'm trying to do here is almost the other way around: augmenting the acceleration data with the gyroscope, which I didn't find much research on. There is probably a reason for that; as you said, maybe this type of position estimation is just unfeasible.
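The orientation-style fusion mentioned here is often done with a complementary (or Kalman) filter: the gyroscope is integrated, and the estimate is slowly pulled toward the angle implied by the accelerometer. A very reduced single-axis sketch, with made-up names and constants, could look like this:

```kotlin
import kotlin.math.atan2
import kotlin.math.sqrt

// Reduced complementary filter for pitch only. ALPHA_FUSION is an assumed
// blending constant, not taken from this project.
const val ALPHA_FUSION = 0.98f

var pitch = 0f

fun updatePitch(gyroRateX: Float, accel: FloatArray, dt: Float) {
    // angle implied by the accelerometer alone (only valid while barely accelerating)
    val accelPitch = atan2(accel[1], sqrt(accel[0] * accel[0] + accel[2] * accel[2]))
    // integrate the gyro rate, then correct slowly toward the accelerometer angle
    pitch = ALPHA_FUSION * (pitch + gyroRateX * dt) + (1 - ALPHA_FUSION) * accelPitch
}
```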

I might also consider implementing an air mouse in the future (as already suggested in #16). This would be much easier, since it could just use an already tried-and-tested sensor fusion implementation.