PX4 / PX4-Autopilot

PX4 Autopilot Software
https://px4.io
BSD 3-Clause "New" or "Revised" License

Trouble using a Vicon mocap system for complete attitude estimates #19212

Open evbernardes opened 2 years ago

evbernardes commented 2 years ago

I'm currently studying a problem in which I need a good attitude estimate at all times (and at this stage I am using a Vicon mocap system to have the complete pose estimate).

Everything seems to be set up correctly and I'm publishing to /mavros/vision_pose/pose, but in the PX4 parameters I could only find an option to fuse the yaw vision data into EKF2. When comparing the full PX4 estimate with the mocap estimate, they drift apart pretty quickly.

What is the best way to tell PX4 to, basically, completely trust the Mocap system, or at least trust it a lot more?

Also, since I had to code a module into the Firmware for my tests, would subscribing to the vehicle_visual_odometry uORB topic inside it, instead of the usual vehicle_attitude one, be a good alternative?

Thanks in advance!

bresch commented 2 years ago

Hi @evbernardes ,

You can select vision position, velocity and yaw fusion in the aid mask. You cannot fuse roll/pitch aiding, and it isn't really useful: you won't get a better estimate unless you get everything time-synced properly, and even then the improvement might be insignificant. So with position, velocity and yaw vision aiding, the EKF should follow your MOCAP system pretty well. If that's not the case, there is a problem. Could you share a log file?
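For reference, external-vision fusion is selected through the EKF2_AID_MASK bitmask, e.g. from the MAVLink/NuttX console (the bit values below are taken from the parameter reference for firmware of this era; double-check them against your firmware's docs):

```shell
# Fuse vision position (bit 3 = 8), vision yaw (bit 4 = 16)
# and vision velocity (bit 8 = 256): 8 + 16 + 256 = 280
param set EKF2_AID_MASK 280
# Optionally use vision as the primary height source (3 = vision)
param set EKF2_HGT_MODE 3
param save
```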

evbernardes commented 2 years ago

Thanks for the input! For debugging purposes, it would actually be pretty useful for my particular problem to fuse roll and pitch. If I can't fuse it, would using vehicle_visual_odometry inside my code on the firmware side be a possible solution for now? Thanks again! :)

bresch commented 2 years ago

I don't see why anyone would need to fuse roll and pitch, but let's say you have a good reason to do so. If you want, you can add an extra fusion function into EKF2 to perform a quaternion update, or, as you said, you could use vehicle_visual_odometry in your module. However, be careful with this: if your datalink breaks or lags, or if there are dropouts or too much delay, the drone will probably crash. An idea would be to use that visual odometry as the primary source of attitude in your new module and fall back to the EKF2 estimate in case of problems.
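A minimal sketch of that fallback idea (names and thresholds here are hypothetical, not PX4 API; in a real module the timestamps would come from the uORB message headers):

```cpp
#include <cstdint>

// Prefer the external (mocap) attitude while the vision stream is fresh;
// fall back to the EKF2 estimate as soon as it times out.
enum class AttitudeSource { ExternalVision, Ekf2 };

struct AttitudeSelector {
    // Maximum age of the last vision sample before falling back (assumed 200 ms)
    uint64_t timeout_us{200000};
    uint64_t last_vision_us{0};

    void on_vision_sample(uint64_t timestamp_us) { last_vision_us = timestamp_us; }

    AttitudeSource select(uint64_t now_us) const {
        const bool vision_fresh = (last_vision_us != 0)
                && (now_us - last_vision_us < timeout_us);
        return vision_fresh ? AttitudeSource::ExternalVision : AttitudeSource::Ekf2;
    }
};
```

A real implementation would also want some hysteresis so the controller does not chatter between the two sources near the timeout boundary.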

evbernardes commented 2 years ago

I see, thanks! Do you have any tips on the best way to do this quaternion update, and where exactly in the code? As a quick last question: do you have an example somewhere of the best way to detect a problem if I end up using the vehicle_visual_odometry route?

bresch commented 2 years ago

Add one of those control...Fusion functions (https://github.com/PX4/PX4-Autopilot/blob/c0facec8891a26282505dd5e83115f47c5695df7/src/modules/ekf2/EKF/control.cpp#L154-L169) that runs the reset and fusion functions containing the fusion equations you generated. Out of curiosity, why do you need direct attitude aiding? Is the attitude not good enough? There's normally less than 1 deg of tilt error and very little delay, given that the attitude prediction runs on the high-rate IMU output. Can you really get better results than this with a MOCAP system?

As a quick last question: do you have an example somewhere of what would be the best way to detect a problem if I end up using the vehicle_visual_odometry route?

Just put in some checks for data timeout, sampling regularity, and so on.
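Those checks could look something like the following (a hypothetical sketch, not PX4 code; the class name, thresholds, and filter constant are all made up for illustration):

```cpp
#include <cstdint>
#include <cmath>

// Health monitor for an external odometry stream: flags the stream as bad
// when it times out or when the sampling interval becomes irregular.
class VisionStreamMonitor {
public:
    explicit VisionStreamMonitor(uint64_t nominal_dt_us)
        : _nominal_dt_us(nominal_dt_us) {}

    void on_sample(uint64_t timestamp_us) {
        if (_last_us != 0) {
            // Exponential moving average of the inter-sample interval
            const double dt = static_cast<double>(timestamp_us - _last_us);
            _avg_dt_us = 0.9 * _avg_dt_us + 0.1 * dt;
        } else {
            _avg_dt_us = static_cast<double>(_nominal_dt_us);
        }
        _last_us = timestamp_us;
    }

    bool healthy(uint64_t now_us) const {
        if (_last_us == 0) { return false; }                        // no data yet
        if (now_us - _last_us > 5 * _nominal_dt_us) { return false; } // timeout
        // Sampling regularity: average interval within 50% of nominal
        return std::fabs(_avg_dt_us - _nominal_dt_us) < 0.5 * _nominal_dt_us;
    }

private:
    uint64_t _nominal_dt_us;
    uint64_t _last_us{0};
    double _avg_dt_us{0.0};
};
```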

evbernardes commented 2 years ago

Ok thanks, I'll check this out! I'm currently working on the attitude control of a spinning monorotor project, and the attitude prediction drifts pretty fast on the other angles when the robot starts spinning quickly, while our pure mocap estimate does a lot better.

bresch commented 2 years ago

@evbernardes Something like this: https://www.youtube.com/watch?v=P3fM6VwXXFM&ab_channel=WeixuanZhang ? Just to be sure, if the IMU is not located at the center of mass, did you set the offset parameters properly (for proper centrifugal acceleration compensation)? I mean, I've seen racing drones yawing faster than this for several seconds without any issue, so I'm a bit worried that you'll lose quite some time trying to fix a configuration problem by adding more code and that it won't really work nicely in the end. Again, if you have a log file I could have a look and maybe give you some tips.
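For reference, the IMU position relative to the vehicle's center of gravity is set through the EKF2_IMU_POS_X/Y/Z parameters (the numeric values below are placeholders, not a recommendation; measure them on your vehicle):

```shell
# IMU location relative to the center of gravity, in metres, body FRD frame
# (0.02 / 0.00 / -0.01 are placeholder values)
param set EKF2_IMU_POS_X 0.02
param set EKF2_IMU_POS_Y 0.00
param set EKF2_IMU_POS_Z -0.01
param save
```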

evbernardes commented 2 years ago

I will recheck this, thanks! I am currently doing tests where the robot cannot translate: it is fixed on a single rotary joint and can only rotate freely around its spin axis, in order to test the attitude estimation, and even so the roll and pitch keep drifting when they should be constant.*

(* By "constant" I don't mean roll and pitch exactly: I analyze the attitude with a ZYZ Euler angle decomposition, and the first and second angles stay constant according to the mocap, as expected, while they go crazy according to the estimator.)
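For concreteness, the ZYZ decomposition mentioned above can be computed from a mocap quaternion like this (a standalone sketch, not PX4 code, assuming the R = Rz(alpha) * Ry(beta) * Rz(gamma) convention):

```cpp
#include <array>
#include <cmath>

// Decompose a unit quaternion (w, x, y, z) into ZYZ Euler angles so that
// R = Rz(alpha) * Ry(beta) * Rz(gamma). For a body spinning about its own
// z axis, alpha and beta describe the spin-axis direction and should stay
// constant while gamma advances with the spin.
std::array<double, 3> quat_to_zyz(double w, double x, double y, double z)
{
    // Rotation-matrix entries needed for the decomposition
    const double r02 = 2.0 * (x * z + w * y);       // R(0,2) =  cos(a) sin(b)
    const double r12 = 2.0 * (y * z - w * x);       // R(1,2) =  sin(a) sin(b)
    const double r20 = 2.0 * (x * z - w * y);       // R(2,0) = -sin(b) cos(g)
    const double r21 = 2.0 * (y * z + w * x);       // R(2,1) =  sin(b) sin(g)
    const double r22 = 1.0 - 2.0 * (x * x + y * y); // R(2,2) =  cos(b)

    const double beta  = std::acos(std::fmax(-1.0, std::fmin(1.0, r22)));
    const double alpha = std::atan2(r12, r02);
    const double gamma = std::atan2(r21, -r20);
    // Note: the alpha/gamma split is undefined when beta is near 0 or pi.
    return {alpha, beta, gamma};
}
```

Plotting alpha and beta over time for both the mocap quaternion and the EKF2 attitude quaternion should make the drift described above directly visible.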