mavlink / mavros

MAVLink to ROS gateway with proxy for Ground Control Station

Vision Estimates and/or setpoints not rotated properly (forwarded quaternions are wrong) #523

Closed MarcGyongyosi closed 7 years ago

MarcGyongyosi commented 8 years ago

Hey guys... me again.

So when I switched to the new firmware I tried to use the external heading option which seems to work in general. I then tried to use the setpoint position plugin together with vision estimations (in a dry run) and well....

First, I think there might be a sign error somewhere, since the yaw setpoint is off by PI/2 (I sent a quaternion of 0,0,0,1 for yaw to be 0, or "forward", expecting the yaw setpoint to be 0).

Here is a plot of the setpoint yaw and att.yaw. You can see it's off by PI, which I suspect is due to a plus/minus mistake in the rotation from ENU to NED. [screenshot: setpoint yaw vs. att.yaw]

In addition, why is att.yaw at -pi/2? I would expect it to be at 0, since I haven't rotated the quad.

If the pi/2 offset is necessary for the rotation from ENU to NED (which might be the case here), I can see how they need to be off by that pi/2. However, when I send a yaw=0 setpoint in a pose where yaw is already 0, I would expect them both to be offset by pi/2 in the same direction, i.e. not setpoint +pi/2 and att.yaw -pi/2 as in the plot above.
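(Aside: a minimal sketch, assuming the standard conventions: ENU yaw is measured from East, counter-clockwise about +Z up, while NED yaw is measured from North, clockwise about +Z down, so the two differ by a constant offset plus a sign flip.)

#include <cmath>

// Illustrative only, not mavros code: the two yaw conventions differ by a
// constant offset plus a sign flip.
double yaw_ned_from_enu(double yaw_enu) {
    return M_PI_2 - yaw_enu;  // facing East: ENU 0 -> NED pi/2; facing North: ENU pi/2 -> NED 0
}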

Furthermore, I am getting some very spooky quaternions for the vision estimates coming through to the Pixhawk (i.e. the visn.quatxyzw ones). The qx value jumps up and down quite dramatically every once in a while (in this plot, once I send the setpoint).

[screenshot: visn.quat x/y/z/w components]

Maybe this behavior is unique to my setup (the Pixhawk is rotated) and I will do my own rotations for now, but this could cause some pretty bad and unexpected behavior for others.

What do you guys think?

Best, marc

vooon commented 8 years ago

@mhkabir is that the bug you mentioned yesterday?

@scott-eddy it seems #473 introduced that bug.

mhkabir commented 8 years ago

@MarcGyongyosi Yes, I ran into this while testing VI-SLAM sometime back. I had to revert #473 in order to get it flying: https://www.youtube.com/watch?v=1YTOmWUwer4

scott-eddy commented 8 years ago

@mhkabir I'm sorry I haven't been able to address this sooner; I've been swamped with other work lately, and I'm guessing I won't be able to put time into it for another week or so. I'd appreciate some help debugging here, as I don't have an external vision setup ready to go yet. Here are some initial debugging tests/questions:

  1. Is yaw the only angle affected here? Are roll/pitch okay?
  2. Can anyone else verify that, after transforming the PX4 orientation from representing NED -> body to representing ENU -> body, the Euler angles make sense? I have done this several times as a sanity check, but it would be nice to have confirmation (see the sketch after this list). If this NED -> body to ENU -> body transform is wrong, then the entire logic of #473 has a bug, but I don't think that is the case.
  3. If the answer to 2 is that the transform does make sense, then the transformation of the attitude from ENU -> body to NED -> body is at issue here. That pipeline of the code is: 1. get RPY from the quaternion; 2. transform the orientation from ENU -> body to NED -> body.
  4. Again, if the transform of the attitude representing NED -> body to ENU -> body is okay, then the easy fix for the ENU -> body to NED -> body direction would be to add a new static quaternion that fixes the problem, though one should not be needed.
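Here is a sketch of check 2, assuming plain Eigen rather than the mavros helpers (names are illustrative):

#include <cmath>
#include <Eigen/Geometry>

// Sketch of check 2: re-express an attitude given w.r.t. NED as the same
// attitude w.r.t. ENU by left-multiplying with the fixed ENU<->NED rotation,
// then inspect the resulting Euler angles for sanity.
Eigen::Quaterniond ned_to_enu_attitude(const Eigen::Quaterniond &q_ned_body) {
    // Fixed frame swap: 90 deg about Z, then 180 deg about X (intrinsic).
    const Eigen::Quaterniond q_enu_ned =
        Eigen::AngleAxisd(M_PI_2, Eigen::Vector3d::UnitZ()) *
        Eigen::AngleAxisd(M_PI,   Eigen::Vector3d::UnitX());
    return q_enu_ned * q_ned_body;
}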

@mhkabir, @MarcGyongyosi , @vooon Do any of you have bandwidth to help debug this?

MarcGyongyosi commented 8 years ago

Unfortunately I won't have time; I need to get a lot of stuff done this week...

I'll look into it if I have some free time...

MarcGyongyosi commented 8 years ago

Alright, so I disabled all the mavros-internal rotations and got it working nicely. No more jumps or weird behavior. My implementation is hacky and probably only works on my setup, because of my multi-camera vision rig, but I can share the few lines of code I use to rotate here:

void send_vision_estimate(const ros::Time &stamp, const Eigen::Affine3d &tr) {
    /**
     * @warning Issue #60.
     * This now affects pose callbacks too.
     */
    if (last_transform_stamp == stamp) {
        ROS_DEBUG_THROTTLE_NAMED(10, "vision_pose", "Vision: Same transform as last one, dropped.");
        return;
    }
    last_transform_stamp = stamp;

    auto position = Eigen::Vector3d(tr.translation());

    // Map the vision frame (x forward, y left, z up) onto NED:
    // keep x, negate y and z.
    double px = position.x();
    double py = -position.y();
    double pz = -position.z();

    double roll, pitch, yaw;

    Eigen::Quaterniond q_eigen(tr.rotation());
    tf::Quaternion q_tf(q_eigen.x(), q_eigen.y(), q_eigen.z(), q_eigen.w());
    tf::Matrix3x3(q_tf).getRPY(roll, pitch, yaw);

    // Same sign convention for the angles: keep roll, negate pitch and yaw.
    vision_position_estimate(stamp.toNSec() / 1000,
            px, py, pz,
            roll, -pitch, -yaw);
}

This assumes the following orientation of the vision inputs (coming into mavros):

Vision estimates relative to the Pixhawk at the origin: x - forward, y - left, z - up.

Roll, pitch and yaw follow the standard convention around these three axes.

I know this is not ideal since the orientation of my vision coordinate frame is non-standard, but it works properly for me.

So in the end, I have the following mapping:

Vision input <-> NED output (in terms of the vision input):

x <-> x
y <-> -y
z <-> -z
roll <-> roll
pitch <-> -pitch
yaw <-> -yaw

I hope this helps some of you!

scott-eddy commented 8 years ago

@mhkabir, @MarcGyongyosi do you guys have any updates on this? Are you both still using a reverted commit/custom mappings?

LorenzMeier commented 8 years ago

@MarcGyongyosi Can we get an update on this?

MarcGyongyosi commented 8 years ago

Hi, I am not using mavros anymore. At the time, though, it was working very well with the above changes. Best, marc

andrea-nisti commented 8 years ago

Hi, I am having the same issues. I believe that if position [x y z] and orientation [qw qx qy qz] are the coordinates coming from the mocap in the ENU frame (x to the front, y to the left, z up), the correct transformation to send to PX4 is position [x -y -z], orientation [qw qx -qy -qz] in NED (x pointing front, y to the right, z down).

I was using this method without mavros and it was working. Since I currently cannot compile mavros from source (it takes ages on the Raspberry Pi), I can't verify it. If someone can, I'd appreciate it.

Rotating the quaternion would add offsets; this way it should be fine and clean.

The RPY are calculated with:

roll  = std::atan2(2*(q.w()*q.x() + q.y()*q.z()), 1 - 2*(q.x()*q.x() + q.y()*q.y()));
pitch = std::asin(2*(q.w()*q.y() - q.z()*q.x()));
yaw   = std::atan2(2*(q.w()*q.z() + q.x()*q.y()), 1 - 2*(q.y()*q.y() + q.z()*q.z()));

from the new (NED) quaternion.
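A minimal Eigen sketch of the mapping I describe above (illustrative only, not mavros code):

#include <Eigen/Geometry>

// Sketch of the proposed mapping: negate y/z of the position and the
// y/z components of the quaternion.
void mocap_enu_to_ned(const Eigen::Vector3d &p, const Eigen::Quaterniond &q,
                      Eigen::Vector3d &p_ned, Eigen::Quaterniond &q_ned) {
    p_ned = Eigen::Vector3d(p.x(), -p.y(), -p.z());
    q_ned = Eigen::Quaterniond(q.w(), q.x(), -q.y(), -q.z());
}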

Thomas00010111 commented 7 years ago

Any news regarding this discussion? In the source code, the ENU-to-NED conversion first rotates the coordinate system by 90 deg (PI/2) about z and then by 180 deg (PI) about x, which makes perfect sense to me. This gives the mapping x -> y, y -> x, z -> -z, which is different from the solutions proposed above. When I send the same position and setpoint, the coordinates in the Pixhawk's log file are different.
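A minimal Eigen sketch of that composite rotation, so anyone can verify the mapping (illustrative only, not the mavros code):

#include <cmath>
#include <iostream>
#include <Eigen/Geometry>

int main() {
    // The composite rotation described above: 90 deg about Z, then 180 deg
    // about X (intrinsic composition, hence Rz * Rx in Eigen).
    Eigen::Quaterniond q_enu_to_ned =
        Eigen::AngleAxisd(M_PI_2, Eigen::Vector3d::UnitZ()) *
        Eigen::AngleAxisd(M_PI,   Eigen::Vector3d::UnitX());

    // Basis mapping check: x -> y, y -> x, z -> -z.
    std::cout << (q_enu_to_ned * Eigen::Vector3d::UnitX()).transpose() << "\n"   // 0 1 0
              << (q_enu_to_ned * Eigen::Vector3d::UnitY()).transpose() << "\n"   // 1 0 0
              << (q_enu_to_ned * Eigen::Vector3d::UnitZ()).transpose() << "\n";  // 0 0 -1
    return 0;
}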

NCharabaruk commented 7 years ago

Has this been looked into any further? I'm providing a Pixhawk with a position estimate via /mavros/vision_pose/pose and the yaw is off by 180 deg.

AlexisTM commented 7 years ago

Same problem as @NCharabaruk on mavros 0.18.4 and PX4 1.5.1.

I am sending an "almost null yaw" quaternion (x:0, y:0, z:0, w:1), since I am at 0° rotation, to /mavros/vision_pose/pose (with external heading set to vision), and the resulting quaternion in /mavros/local_position/pose is almost 180° (x:0, y:0, z:-1, w:0).

NCharabaruk commented 7 years ago

It looks like this is due to how the NED->ENU conversion is done by mavros. I just put in a 180° offset.
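A sketch of that workaround, assuming Eigen (illustrative only):

#include <cmath>
#include <Eigen/Geometry>

// Pre-multiply the measured orientation by a 180 deg yaw rotation
// before using it; the rotation is applied in the world frame.
Eigen::Quaterniond add_yaw_180(const Eigen::Quaterniond &q_measured) {
    const Eigen::Quaterniond q_offset(Eigen::AngleAxisd(M_PI, Eigen::Vector3d::UnitZ()));
    return q_offset * q_measured;
}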

AlexisTM commented 7 years ago

That's what I was about to do; I got it right today. Thanks @NCharabaruk :D

hmchung commented 7 years ago

Hi, I am running into the same issue. Here the vision yaw and the attitude yaw are almost the inverse of each other. Does anyone have any clue how to fix this? Thanks in advance! [screenshot: vision yaw vs. attitude yaw]

AlexisTM commented 7 years ago

Hi, I added a 90° offset on the position and a 90° offset on the setpoint.

mhkabir commented 7 years ago

There is a missing body-to-inertial transform in the vision plugin; I will send in a PR when I have my laptop working again.

hmchung commented 7 years ago

@AlexisTM, thank you for your prompt reply. I tried, but it does not help. :( What I did was add 90 deg to the yaw before sending the pose to vision. No help. Then I tried my luck rotating 90 deg, then 180 deg, and comparing /mavros/local_position/pose with the true pose from ROS. Also no help.

@mhkabir Thank you for your hint. I tried adding some frames to account for the compass difference. My TFs are: local_earth -> local_origin -> uav_sp (for ease of the lab coordinate system) and local_earth -> vision_origin -> uav_base (basically the local_earth frame rotated to align with the compass).

The mavros vision_position TFs are local_earth -> uav_base and local_earth -> uav_sp. This is what I get: [screenshots: before and after]

Notice that the new vision estimates were coming in at exactly the same value as the FMU estimates before vision, but the local position estimate just jumps.

Any comments are welcome. Thanks in advance.

hmchung commented 7 years ago

Hi @mhkabir, I am quite interested in the offboard control mechanism and the frame transforms. I did some searching but could not find the code for these parts. Could you please give me some hints? Thank you.

AlexisTM commented 7 years ago

http://dev.px4.io


mhkabir commented 7 years ago

Fixed in https://github.com/mavlink/mavros/commit/d2db9f68f66041ab423ca6906b06d5dbe539981d

AlexisTM commented 7 years ago

Thanks a lot @mhkabir

Changliu52 commented 7 years ago

@mhkabir The fix doesn't seem to work? I am using PX4 v1.6.0RC LPE firmware and the current master mavros. [plot: quaternion components]

mhkabir commented 7 years ago

@Changliu52 Your analysis is flawed. How are you directly comparing the quaternions like that? That is ridiculous. Compare the Euler yaw or the position values.

Make sure:

  1. You use the master LPE firmware for PX4 (px4fmu-v2_lpe). There is a critical fix for it.
  2. ATT_EXT_HDG_M is set to 1.

AlexisTM commented 7 years ago

@mhkabir As he said, he is using the PX4 1.6 RC1, LPE version. ATT_EXT_HDG_M is to be checked.

@Changliu52 Can you check that you are using the right convention? (ENU through mavros, but NED through MAVLink)

Changliu52 commented 7 years ago

@mhkabir @AlexisTM thanks for the reply.

I was using px4fmu-v2_lpe, compiled from the v1.6RC1 source. The related parameter settings are:

ATT_EXT_HDG_M = Vision
ATT_ACC_COMP = Disabled
ATT_MAG_DECL_A = Disabled
ATT_W_EXT_HDG = 0.3
ATT_W_MAG = 0.0
LPE_FUSION = 140 (fuse vision position, vision yaw, baro)
LPE_BAR_Z = 10 m
LPE_VIS_XY = 0.01 m
LPE_VIS_Z = 0.01 m

Yaw fusion approaches the negative of the vision yaw: [plot: yaw]

Good position tracking (x and z axes shown): [plot: x, z]

Vision roll has a 180-degree offset (I know it is not supposed to fuse vision roll): [plot: roll]

Vision pitch is reasonably aligned (I know it is not supposed to fuse vision pitch): [plot: pitch]

This suggests to me that the vision orientation is somehow upside down (rotated 180 degrees about the body x axis in MAVLink). I was using SVO for orientation; maybe I need to rotate the quaternion in vision_pose/pose.

mhkabir commented 7 years ago

Yes, your vision frame is wrong. You need to make sure it is in ENU.

AlexisTM commented 7 years ago

@Changliu52 Also, only the heading is fused; roll and pitch are not.

Changliu52 commented 7 years ago

The position was correct all along. The problem was in the orientation quaternion: the orientation conversion from the vision pose to MAVLink was upside down, e.g. vision_pose q = (1,0,0,0) in ROS results in rpy = (3.14, 0, 0) in MAVLink.

I manually rotated the vision pose 180 degrees about the world x-axis. It seems to be working now: https://youtu.be/WMEQ7CppB0E

I think MAVLink uses NED for the world frame and ENU for the body frame; only that would explain the upside-down problem. The orientation quaternion should be the same regardless of NED or ENU, but mixing them results in a difference.
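A sketch of the manual fix I describe above, assuming Eigen (illustrative only):

#include <cmath>
#include <Eigen/Geometry>

// Flip the vision orientation 180 deg about the world X axis before
// publishing it; left-multiplying applies the rotation in the world frame.
Eigen::Quaterniond flip_about_world_x(const Eigen::Quaterniond &q_vision) {
    const Eigen::Quaterniond q_flip(Eigen::AngleAxisd(M_PI, Eigen::Vector3d::UnitX()));
    return q_flip * q_vision;
}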

SiddharthPatel45 commented 7 years ago

Hi guys! This has been a long read now. First, thank you everyone for contributing.

After reading through these PRs and issues: #307 #225 #208 #2078 #49 #216 #317 #2361 #438 #390 #470 #473 #5008, I'm still not sure about the coordinate transformations between the Pixhawk and MAVROS. I read through all the issues and comments from the past two years and am still somewhat confused about what to use. So I'm here looking for answers, and also to describe what I've achieved.

Background

I am a master's student working on system identification of a tricopter, with a Pixhawk and a Raspberry Pi as the OBC. I have an OptiTrack motion capture system to track the body, feed the position & attitude back to the Pixhawk, and fly manual trajectories in OFFBOARD mode (done). I collect inputs (motor RPMs) and outputs (state estimates: position, velocity, Euler angles, angular rates) to perform system identification using MATLAB code I wrote. I collect the outputs over /mavros/local_position/*. I am able to achieve POSCTL and OFFBOARD position control using code running on the GCS. I align the forward direction of the UAV (x-axis) with the x-axis of the mocap system, which I have defined as ENU.

Note: @andrea-nisti I too use a Raspberry Pi 3 and suffered for a week (or even more) to put ROS and MAVROS on it. If you still need it, you can follow this doc to install MAVROS on it. I haven't tried with the newer version; this may be solved in newer versions, as I did it a year ago. Anyhow, in case you need it!

Problem

I don't have any problem executing OFFBOARD mode via MAVROS; it works just perfectly. I get mocap data from the Motive software on the GCS and send it to the UAV directly over mavros/mocap/pose without any transformation. I assume that ROS works in the ENU frame, converting to the Pixhawk's NED frame while publishing and converting the data back to ENU while subscribing (correct me if I'm wrong here). I confirmed this by connecting to the UAV via UDP and comparing the data in QGC and /mavros/local_position/pose. Up to this point, I assumed that all the data I got from this topic was in the following order:

So I performed the respective transformation for position and Euler angles, and no transformation for velocity and angular rates, as is usual for UAV modelling.

Now, when I try to perform system identification on this data, it doesn't work out. My code estimates a few states properly, but not all of them, so I knew something was wrong. Next, I tried using the IMU data from /mavros/imu/data, which gives you the orientation quaternion, angular rates and linear accelerations. I compared this with the data from /mavros/local_position/* and found that the rates match exactly for both, and even the angles, as shown here. But when you calculate the velocity by integrating the linear accelerations from the IMU data and compare it with the local data, you get this. So either the recorded velocity is not in the correct coordinate frame, or the linear accelerations are in a different coordinate frame. Also, for the angles, if you integrate the rates to get the Euler angles in a similar way, they don't seem to match, as you see here. All this data was recorded for a simple circular trajectory (radius 1 m) at constant height. This definitely proves that the transformation I'm using is still wrong somewhere!
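For reference, this is the consistency check I mean, as a hedged sketch (the names and the ENU gravity convention are my assumptions, not mavros code):

#include <Eigen/Geometry>
#include <vector>

// Rotate body-frame IMU accelerations into the world frame, remove gravity,
// and integrate to compare against the velocity reported under
// /mavros/local_position/*. Assumes att and acc_body have the same length.
Eigen::Vector3d integrate_velocity(const std::vector<Eigen::Quaterniond> &att,    // attitude per sample
                                   const std::vector<Eigen::Vector3d> &acc_body,  // IMU linear acceleration per sample
                                   double dt)                                     // sample period
{
    const Eigen::Vector3d g(0.0, 0.0, 9.81);  // gravity in ENU (z up)
    Eigen::Vector3d v = Eigen::Vector3d::Zero();
    for (std::size_t i = 0; i < acc_body.size() && i < att.size(); ++i)
        v += (att[i] * acc_body[i] - g) * dt;  // world-frame acceleration minus gravity
    return v;
}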

Note: @scott-eddy you were trying to access the body velocities, right? How did you manage to get them? For my case, I just need to know the exact coordinate frame the velocities are in, to perform the required transformation. Any insight would be greatly appreciated.

Observations

As I was going through the issues others faced, I found that some of you transform the mocap data to NED before sending it to the Pixhawk, and some of you don't.

  1. Do we need to make this transformation manually, or does MAVROS take care of it?
  2. What are the Pixhawk's local and global frames without GPS?
  3. Which frames and coordinate systems are the local position and IMU data in?

I noticed that no matter which direction the Pixhawk faces when turned on in a GPS-denied space (indoors), zero yaw always points towards magnetic north (due to the magnetometer, I guess), and the data in /mavros/local_position/* is already converted to ENU. But once we start publishing the mocap data, the data in /mavros/imu/data also starts to align with the local data (not exactly the same, but still, as shown in the figures above). Why?

I would like to know what you guys think, and what and where the correct transformation is required. I can fly perfectly and do everything I want, but there is something wrong with the transformation somewhere. @vooon @mhkabir @TSC21 @jespestana I'd really appreciate your input. I also have the equipment at my disposal, so I can perform any tests you want to verify. If this has already been sorted out, please point me towards it and I would be thankful, as I was not able to find my answers.

Thanks a lot! Sid

TSC21 commented 7 years ago

I'm going to temporarily reopen this so it can be clarified. I was one of the first people to try to solve issues related to rotations and frame transform problems, but then I was away from the project for some time and lost track of the current transformations applied. I advise you to review what's in frame_tf.h and check whether the applied transforms correspond to what you are observing.

As a reminder, there are transforms applied to orientations, transforms between ENU and NED (and vice versa), and transforms between the baselink frame and the aircraft frame (and vice versa). Since you have access to an OptiTrack system, I ask you to review the current transforms and compare them with the results you get, to see whether, after so much time spent on this, there is still a flaw in some transform.
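As a starting point, a hedged sketch of what the composed transform should look like, assuming the helpers currently exposed by frame_tf.h:

#include <Eigen/Geometry>
#include <mavros/frame_tf.h>

// Convert a ROS orientation (ENU world, baselink/FLU body) to the MAVLink
// convention (NED world, aircraft/FRD body): both the body-frame and the
// world-frame transform have to be applied.
Eigen::Quaterniond ros_to_mavlink_orientation(const Eigen::Quaterniond &q_enu_baselink) {
    using namespace mavros::ftf;
    return transform_orientation_enu_ned(
            transform_orientation_baselink_aircraft(q_enu_baselink));
}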

Thanks for your help!

andrea-nisti commented 7 years ago

I restarted my research on this topic just one week ago and I am setting up the system with OptiTrack. I will give you feedback as soon as possible. I think the problem can be fixed just by swapping axes in the position and quaternion, as I mentioned some months ago; this preserves the position and, more critically, the orientation. It is a quick fix I used years ago with custom software that had the same issue. What I propose: take the values coming from the mocap_optitrack node, which are in the ROS convention, and just swap the axes (quaternion and position) to get NED while preserving attitude. Otherwise we will have offsets in yaw, for example (which I can see right now).

I will let you know and thank you all for your work!

@siddhu45 I have Ubuntu MATE on my Raspberry Pi, and right now I get everything via apt-get with ROS Kinetic ;)

SiddharthPatel45 commented 7 years ago

@TSC21 Thanks for reopening! I have tested the orientations in OptiTrack with the Pixhawk and posted the results in my comments above. To summarize:

Other than this, yes, I have an OptiTrack system; what specifically would you like me to test? The ENU-NED conversion and vice versa? The reverse direction I have already confirmed: the data in /mavros/local_position/* is already in ENU, which means the NED-to-ENU conversion is proper. Apart from this, what exact tests would you like me to do to verify? I'd be more than happy to do them.

@andrea-nisti Are you sure you change the axes and the quaternion before sending the mocap data to the Pixhawk? As I said, I don't change anything and publish it unchanged, and I can still perform the tasks and achieve POSCTL & manual trajectories in OFFBOARD. Not sure if it will work, but I can test and will report the results. Do you have any results I can see showing that this conversion of mocap data to NED actually works for you? That would be great! And smart move doing that on the Raspberry Pi, seriously; I had a headache making that thing work. :D :+1:

Anyway, input from others would also be great, and I can run whatever tests are needed to verify. Cheers!

SiddharthPatel45 commented 7 years ago

@TSC21 @andrea-nisti @mhkabir Ping!

andrea-nisti commented 7 years ago

Sorry, Easter holidays here!

@siddhu45 Regarding the first part of your comment, I would like to point out that you can also estimate yaw from vision feedback by setting a parameter in the estimator module. In this way, the yaw is 0 along the x axis; it means that x becomes your north. Since we have a lot of interference here in the lab, this is what I am doing.

@TSC21 About the rotation in mavros I have some updates!

Quick recap: mavros takes the position input in ENU and applies first a rotation about Z by pi/2 and then a rotation about X by pi. This way you get NED and the position is correct (be careful: x and y will be swapped).

That said, we will have:

ENU (from mocap) <-> NED coordinates (in MAVROS)
x <->  y = Xned
y <->  x = Yned
z <-> -z = Zned

What I do is the following: align the robot's X with the ENU X, create the rigid body, then start streaming its position. You will see a PI/2 yaw in the MAVLink vision topic from PX4, but don't worry, it is due to the rotations. X and Y will be swapped in NED and Z will be negative, as it should be. Just place your ENU frame wherever is convenient for you and it is fine.

Here are some attitude logs. Vision yaw in black (NED): [plot: yaw]

Vision roll in black: [plot: roll]

And finally the pitch (could the offset be a sensor calibration issue?): [plot: pitch]

Regarding setpoints, we should send them in ENU, right? And yaw setpoints with respect to the ENU X axis, I guess?

Thanks, Andrea

mhkabir commented 7 years ago

Regarding setpoints, we should send them in ENU, right? And yaw setpoints with respect to the ENU X axis, I guess?

Yes, and yes.

SiddharthPatel45 commented 7 years ago

@andrea-nisti haha, thanks for posting your results! I will check them out, but it will take a week (I'm not returning to the lab for a week). I'll get back as soon as I verify them and use them for my project!

Cheers Siddharth

andrea-nisti commented 7 years ago

@mhkabir @TSC21 @siddhu45 After testing, I can say that the transforms are consistent and everything works fine in flight. Are there any particular tests you would like performed?

TSC21 commented 7 years ago

Dear @andrea-nisti, thank you so much for the time and effort you dedicated to testing. So that this is useful to the community and doesn't get stuck in a GitHub issue, I ask you to please share your testbench structure (HW, SW), how you performed the above tests (so people can reproduce them the same way when they need to), and an example of your results as you did above, in two different places:

  1. https://dev.px4.io/en/ros/external_position_estimation.html - create a subsection about asserting the rotations and transforms between an external system's frames and the FCU frames, or something like that
  2. http://discuss.px4.io/ - open a topic and share your results, please.

Once again, thank you very much!

andrea-nisti commented 7 years ago

@TSC21 It's my pleasure, no problem ;) As soon as I solve a couple of problems that came up a few days ago, I will run a couple more tests, share my results, and update the guide on mocap usage.

Apparently Wi-Fi and the RC use the same frequency. As soon as I turn on the RC, the ping spikes (up to 8 seconds!!), resulting in mocap timeouts and crashes :(

I hope to find a solution to this problem very soon so I can share my results

TSC21 commented 7 years ago

@andrea-nisti did you get anywhere with this? Be careful with the world-frame rotations versus the body-frame rotations; in particular, the angular velocities are usually WRT the body frame.

TSC21 commented 7 years ago

Please share your findings as soon as you can, as I suggested in https://github.com/mavlink/mavros/issues/523#issuecomment-298563780. Thanks

andrea-nisti commented 7 years ago

Hey, sorry for the inconvenience. We are in the middle of an upcoming deadline and working like crazy to deliver everything. Next week it will be over, and I will share my results and update the PX4 guide. Sorry again for the delay. I will share logs; we can fly pretty well and the position/orientation seems right.

TSC21 commented 7 years ago

Hi @andrea-nisti, that's perfect, thanks! Have you applied any modifications to the frame transformations to get acceptable/consistent results? There was an issue tracked on the odom plugin where the angular velocities, since they are body-frame-related, needed to be aware of the current attitude of the vehicle to keep the rotations consistent. That's because the ROS odometry message defines the pose WRT the world frame, while the velocities are WRT the body frame (applied here: https://github.com/mavlink/mavros/blob/master/mavros_extras/src/plugins/odom.cpp#L100-L103). Good luck with the projects, and thank you.
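For illustration, a minimal sketch of that idea (plain Eigen, names illustrative):

#include <Eigen/Geometry>

// In a ROS Odometry message the twist is given in the body (child) frame,
// so converting it to another world convention must use the current
// attitude, not a fixed static rotation.
Eigen::Vector3d body_rates_to_world(const Eigen::Quaterniond &q_world_body,
                                    const Eigen::Vector3d &omega_body) {
    return q_world_body * omega_body;  // rotate body-frame rates into the world frame
}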

andrea-nisti commented 7 years ago

Hey @TSC21, right now I do everything with off-the-shelf software. I haven't thought about rotation rates; I need to check that too. Since everything is working, I haven't touched anything, for fear of breaking things. So basically, I need to check consistency of the angular rates too (they should be in the body frame), right?

SiddharthPatel45 commented 7 years ago

Hi everyone, sorry for being away for so long; I had a deadline to finish my thesis and had to prepare a lot. Thanks @andrea-nisti for your efforts. I can say one thing for sure, @TSC21: without any transformations, we too are able to get the position/orientation and everything correct in the OptiTrack motion capture system. But as I reported earlier, the problem arises when I use the data from /mavros/local_position/* for system identification; the inputs and outputs are not consistent. I know my inputs, and I know they are more or less correct when the UAV is flying, but when I feed the same inputs to the mathematical model equations, the response is not correct. That's why I was wondering about the transformations and the coordinate frame of the data I am using.

TSC21 commented 7 years ago

I need to check consistency of the angular rates too (they should be in the body frame), right?

It really depends on the convention, but for ROS they should.

the inputs and outputs are not consistent

I suppose you are talking about the orientation only, right?

andrea-nisti commented 7 years ago

Me again. Unfortunately the deadline got postponed to next Monday, and I will be busy for another week. On the other hand, I will perform more experiments because of it and share my results as soon as possible. Sorry for the inconvenience.

TSC21 commented 7 years ago

@andrea-nisti no trouble ;) thanks for your commitment to this

SiddharthPatel45 commented 7 years ago

@TSC21 OK & yes. I got a chance to interact with someone from Zurich, and they suggested the input/output inconsistency may also be because I'm not using timestamps. I thought that since I send commands to the UAV at the same frequency, the actual inputs and outputs I record from ROS would also be on the same timeline. Anyway, I need to check this, but we had a very bad crash last week testing another controller, and I need to repair my UAV (which will take some time). I will get back to you soon. Meanwhile, let me know if you suggest any other tests. If someone can check whether this timestamp approach for sending mocap data to the Pixhawk actually works, do let me know; otherwise I will try it a little later anyway.
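For anyone who wants to try the timestamp idea, a hedged sketch (the topic, frame_id and names are my assumptions, not verified code):

#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>

// Stamp each mocap sample with its capture time before publishing, so the
// FCU can align it with its own estimates. "vision_pub" is assumed to be
// advertised on /mavros/vision_pose/pose.
void publish_mocap_pose(ros::Publisher &vision_pub,
                        const geometry_msgs::Pose &pose,
                        const ros::Time &capture_time) {
    geometry_msgs::PoseStamped msg;
    msg.header.stamp = capture_time;  // the mocap sample time, not the publish time
    msg.header.frame_id = "map";
    msg.pose = pose;
    vision_pub.publish(msg);
}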