ethz-asl / ethzasl_sensor_fusion

time delay single and multi sensor fusion framework based on an EKF

mobile tests #8

Open aswinthomas opened 11 years ago

aswinthomas commented 11 years ago

Hi all, I am currently trying to get the package running on an Odroid platform with Ubuntu. I am not sure if I am doing things right, so please advise.

1) I am using a setup rig with a Watec camera at 30 fps (for PTAM) and an Xsens IMU at 100 Hz: https://dl.dropbox.com/u/8948006/setup.jpg. Is it necessary to specify the distance and orientation difference between the IMU and the camera anywhere?
2) I run PTAM and sensor_fusion on the Odroid (which is on the rig) and checked that all topics are being subscribed and published as required.
3) I start PTAM. It runs at 22 fps, but with sensor_fusion this drops to 15 fps. I guess this is a hardware limitation. Running the IMU at higher rates could be worse.
4) I set the accelerometer and gyro noise as per the calibration certificate of the IMU. I then start the filter remotely (no other settings have been changed).
5) I excite the system so that all velocities reach close to zero and give smooth output.
6) Then I switch on the PTAM visualizer and the fusion visualizer.
7) I tried moving the rig to simulate take-off, moving around and reducing altitude.

Results: https://dl.dropbox.com/u/8948006/fusion1.png https://dl.dropbox.com/u/8948006/fusion2.png https://dl.dropbox.com/u/8948006/fusion3.png https://dl.dropbox.com/u/8948006/fusion4.png (yellow is PTAM, green is sensor_fusion)

8) In general, the fused data does not move as much as the update sensor. Is this expected?
9) Once PTAM loses track of the map, the filter does not output any more data. Is it true that there is no prediction if there is no update?
10) Given the higher computational cost, is it better to use PTAM alone for flight? In what situation would the filter be useful?

Regards Aswin

simonlynen commented 11 years ago

Hi Aswin,

1) Yes, you need to specify the camera-IMU attitude (and roughly the position) in the file pose_sensor_fix.yaml, expressed in the body (IMU) frame; see the sketch after this list.
2) Such a large influence of the sensor_fusion package on the PTAM frame rate sounds strange to me, unless you are processing this on a single-core machine. Given that you use the Odroid, there should be sufficient processing power for the filter to run alongside PTAM. With our setup we use IMU data at rates from 100 Hz to 2 kHz, and PTAM runs at 30 Hz.
3) The filter movement compared to the PTAM movement depends on the estimated scale. It looks to me as if you initialized the scale with a value of 30; however, you have to provide a value which relates the initial stereo-pair distance of PTAM to a metric value.
4) Publishing of the pose stops if there is no update; however, the propagation continues in the background.
5) As I wrote before, the computational cost should not be that high. The filter outputs the attitude at a higher rate, which will be crucial for control.
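
For reference, here is a minimal sketch of the corresponding block in pose_sensor_fix.yaml. The identity values are placeholders, not a calibration: q_ci is the camera-IMU rotation quaternion in (w, x, y, z) order and p_ci the camera position in meters, both expressed in the IMU frame.

    # camera-IMU extrinsics, expressed in the body (IMU) frame
    init/q_ci/w: 1.0   # identity rotation: placeholder, substitute your rig's calibration
    init/q_ci/x: 0.0
    init/q_ci/y: 0.0
    init/q_ci/z: 0.0

    init/p_ci/x: 0.0   # camera position relative to the IMU, in meters
    init/p_ci/y: 0.0
    init/p_ci/z: 0.0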

Simon

simonlynen commented 11 years ago

Aswin,

something else: it would probably be a good idea to get raw IMU measurements, as Markus stated in the mail yesterday. Without gravity it will be hard to estimate the attitude correctly.

Simon

aswinthomas commented 11 years ago

Hi Simon, as far as I know, the Xsens MTi sends out acceleration including gravity. Does a bias_z of 10 mean that there is no gravity?

I will update the attitude in the config file and try again. So if I understand correctly, if I do not need attitude information (i.e. only the 3DOF of PTAM), I don't need to use the filter? Is it possible to get velocity information from PTAM?

Pardon me for my lack of knowledge on the subject.

Aswin

simonlynen commented 11 years ago

Hi Aswin,

Yes, as Markus wrote, you must use an IMU which provides measurements including gravity; otherwise two things are problematic:

1) The attitude w.r.t. the world frame is no longer observable, because the PTAM attitude will drift over longer distances travelled.
2) To calculate the gravity compensation, the IMU uses the gyro readings. However, basic EKF theory does not allow these two measurements to be correlated.

If you want only position information, the filter can still provide an estimate of the scale, thus giving you access to metric measurements.

PTAM returns absolute values with respect to the first frame of the stereo-init pair, so you can calculate velocities. However, if you want to use this information for control, you will need a higher rate of smoothed position estimates, which the filter can provide.
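
A minimal sketch of such a finite-difference velocity, assuming PTAM's pose arrives on /vslam/pose as a geometry_msgs/PoseWithCovarianceStamped (topic and message type as used later in this thread; adjust to your remapping). Note the result is in PTAM's scaled units per second, not m/s, until the scale is resolved:

    import numpy as np
    import rospy
    from geometry_msgs.msg import PoseWithCovarianceStamped

    prev_t, prev_p = None, None

    def pose_cb(msg):
        global prev_t, prev_p
        t = msg.header.stamp.to_sec()
        p = np.array([msg.pose.pose.position.x,
                      msg.pose.pose.position.y,
                      msg.pose.pose.position.z])
        if prev_t is not None and t > prev_t:
            v = (p - prev_p) / (t - prev_t)  # finite difference of PTAM positions
            rospy.loginfo("PTAM velocity (scaled units/s): %s", v)
        prev_t, prev_p = t, p

    rospy.init_node('ptam_velocity')
    rospy.Subscriber('/vslam/pose', PoseWithCovarianceStamped, pose_cb)
    rospy.spin()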

If you however use e.g. an AR-Drone, which stabilizes itself, you can also use PTAM directly for position control in a non-metric frame.

Simon

stephanweiss commented 11 years ago

Hi Aswin,

If I may join the conversation, I would like to ask you to provide us with the following:

Besides that, you can

Best Stephan



aswinthomas commented 11 years ago

Hi Stephan, please give me a few days to reply since I am not in the lab. My system setup is shown below:
https://dl.dropbox.com/u/8948006/setup.jpg -- side view, IMU + camera
https://dl.dropbox.com/u/8948006/setup2.jpg -- top view, IMU + camera

Cheers

aswinthomas commented 11 years ago

Hi Stephan, Thank you for the suggestion on set heights. It works. Acceleration plot when stationary: https://dl.dropbox.com/u/8948006/accplot.png Does this look right?

Screenshot of dynamic reconfigure: https://dl.dropbox.com/u/8948006/dynamic.png

Parameters in the config file:

init/q_ci/w: 1.0
init/q_ci/x: 0.0
init/q_ci/y: 0.0
init/q_ci/z: 0.0

init/p_ci/x: 0.0
init/p_ci/y: -10.0
init/p_ci/z: -5.0

Thank you Aswin

stephanweiss commented 11 years ago

Aswin,

Thanks for the plots and the data. The acc plot and the dynamic-reconfigure parameters look mostly good. Looking at the rest, it surprises me that the filter works at all, given the following:

Let me know if the filter works correctly after these corrections.

Best Stephan



aswinthomas commented 11 years ago

Hi Stephan,
1) The filter was initialized 1 m from the ground.
2) Regarding the q_ci parameters: if the camera is looking down, isn't the z axis of PTAM also facing down? The IMU z axis is also facing down. There could be a 180-degree yaw difference in the x and y axes.
3) Thanks for the suggestion on the p_ci values. I have corrected them.

At present I can obtain good values for velocity. The position, however, seems wrong (it shows 50 m in x and 100 m in y). I guess this is expected. My basic requirement was velocity (m/s, from ssf_core) and camera-frame position (from PTAM). I hope this will be enough to control a helicopter.

Thank you for your support. Cheers

icoderaven commented 11 years ago

Hi Stephan,

Sorry for hijacking this thread, but on a related note, can I use a front-facing camera with PTAM? I could not find any information regarding this in the tutorials or the issues. What modifications would I need to make for such a configuration?

I have already set the IMU->camera transformation with the following parameters in pose_sensor_fix:

init/q_ci/w: -0.5
init/q_ci/x: 0.5
init/q_ci/y: 0.5
init/q_ci/z: 0.5

init/p_ci/x: 0.10
init/p_ci/y: 0.0
init/p_ci/z: 0.08

(My IMU has a local NED frame, with X pointing forwards, and Z downwards, and the camera is located facing forward along the IMU's positive X axis)

Specifically, I'm having trouble determining the scale parameter. The pose estimate provided by the filter rapidly jumps to 10^38, and I am beginning to wonder if it's because the framework implicitly assumes that the camera is facing downwards.

My overall aim is to obtain satisfactory position estimates over short periods of time (a few seconds).

Thanks a lot!

p.s. I've also attached a rosbag for your perusal: https://dl.dropboxusercontent.com/u/37626302/2013-04-16-01-07-33.bag And here's the screenshot (I'm looking at a sofa right in front of a wall, and the drone is 2.06 meters from the wall): [Screenshot from 2013-04-16 01:08:59]

stephanweiss commented 11 years ago

Hi

Yes, you can use a front-facing camera with our framework (and PTAM). The view direction of the camera is irrelevant for our algorithms as long as you initialize the states close to the true values.

That is:

I am currently travelling, hence the short answer. I will be back on Friday.

Best Stephan



markusachtelik commented 11 years ago

You should also rotate your IMU coordinate system to the convention we use (somewhat ENU): x forward, y left, z up. Otherwise, gravity gets subtracted in the wrong direction during prediction…
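
A minimal sketch of that axis change, assuming the driver reports body-frame accelerations and rates in NED (x forward, y right, z down); flipping y and z is a proper 180-degree rotation about x:

    import numpy as np

    # NED (x fwd, y right, z down) -> filter convention (x fwd, y left, z up)
    R_flip = np.diag([1.0, -1.0, -1.0])  # 180 deg rotation about the x axis

    def ned_to_filter(acc_ned, gyro_ned):
        # the same fixed rotation applies to accelerations and angular rates
        return R_flip @ np.asarray(acc_ned), R_flip @ np.asarray(gyro_ned)

    # a stationary NED accelerometer reads about (0, 0, -9.81); after the
    # flip the gravity reaction is +9.81 along z, as the filter expects
    acc, gyro = ned_to_filter([0.0, 0.0, -9.81], [0.0, 0.0, 0.0])
    print(acc)  # -> [0. 0. 9.81]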

stephanweiss commented 11 years ago

Yes, that's a good point. So either you change your IMU coordinate system or you change the gravity vector defined in the [sensor]_measurements.h file.

Best Stephan



icoderaven commented 11 years ago

Aha. That explains it. Thanks! I'll change the coordinate systems and try. Do you think this coordinate convention should be specified on the ROS page?

markusachtelik commented 11 years ago

We're using the standard ROS conventions described here: http://www.ros.org/wiki/geometry/CoordinateFrameConventions, but it would indeed make sense to put that link on the page.

stephanweiss commented 11 years ago

Yes, I will update the tutorials as soon as I am back. I am unsure whether the IMU coordinate system is the main cause of your issues, since I assume you initialized all corresponding attitudes using this coordinate frame (so it should be consistent). Usually, a "wrong" g vector just results in a z bias estimate of about 18 and some other, minor inconsistencies. I will have a closer look at your bag file and the rotations on Friday. In the meantime, please also check your vision-world initialization and the correct linking of the topics.

Best Stephan



icoderaven commented 11 years ago

Hi Stephan, Markus

I've tried changing my camera's orientation to point downwards so as to coincide with the IMU's NED frame, and even tried flipping the signs on the accelerations in the IMU topic (so that, e.g., my z coordinate showed positive 9.8-ish values), but to no avail.

When I track both the /vslam/pose and the /imu/data topics, rviz shows me that they are both oriented in a NED frame and their rotations correspond as well, so it should not be a quaternion issue either. (I set the quaternion rotation in my PTAM config file to 0,0,0,1.)

Also, the pose estimate finally seemed to be working well, but quite often, just when it appears to converge, the prediction step suddenly starts throwing NaN errors. A very good example of such a situation is towards the end of the attached rosbag.

Any further inputs?

Thanks a lot for your help!

p.s. Rosbag https://dl.dropboxusercontent.com/u/37626302/2013-04-21-02-53-48.bag

stephanweiss commented 11 years ago

Hi,

sorry for the very late reply. Things are a bit busy at the moment.

As far as I remember, the conflict only occurs on old compilers (gcc 4.3, I think). Concerning the rest, note that the quaternion init is (w,x,y,z) and not (x,y,z,w). I am going through your bag file now.
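
For illustration, the reordering is just this (ROS/tf stores quaternions scalar-last; the filter's init parameters are scalar-first):

    q_xyzw = (0.0, 0.0, 0.0, 1.0)        # identity in ROS/tf (x, y, z, w) order
    q_wxyz = (q_xyzw[3],) + q_xyzw[:3]   # -> (1.0, 0.0, 0.0, 0.0) for the filter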

Best Stephan


From: Kumar Shaurya Shankar [notifications@github.com] Sent: Thursday, May 16, 2013 10:31 AM To: ethz-asl/ethzasl_sensor_fusion Cc: stephanweiss Subject: Re: [ethzasl_sensor_fusion] mobile tests (#8)

I was going through the source code directory and just saw the IMPORTANT file in ssf_core that mentioned that the ROS PCL libraries conflict with EKF updates. That might be causing the problem. Has that issue been resolved yet?


icoderaven commented 11 years ago

Hi Stephan

Thanks for replying! I did make progress on this: I think I managed to get the filter working (i.e. the x, y and z coordinates look about right). And yes, I noticed that the quaternion order is [w x y z] (image: it_works). But when I plot the poses from the pose output topic, I see very odd behaviour. Specifically, at every PTAM update the pose jumps to where it should be and then drifts rapidly (see the next figure; I was moving the quadrotor in circles, and the pose output drifts rapidly outwards until the next PTAM update, when it jumps to another point). That seems to suggest that the pose output is different from the state vector in the stateout message (image: huh).

I saw that the IMU in your data was in the NWU frame, and the camera looking down was in the NED frame, so the appropriate quaternion transformation was a [0 1 0 0] quaternion. Since my 3DM-GX3-25 outputs IMU data in the NED frame, I flipped the accelerometer values and rotated my orientation matrix with the above-mentioned rotation transform (A diag(1,-1,-1)). I paused work on this since I started focusing on getting PTAM working well first. I shall look into it again in a few days.
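
For what it's worth, a small sketch of that attitude flip: the quaternion [0 1 0 0] in (w, x, y, z) order is exactly the 180-degree rotation about x that diag(1,-1,-1) represents. Whether it multiplies on the left or the right depends on whether your driver's quaternion maps body to world or world to body, so check the result against a known pose:

    import numpy as np

    def quat_mult(q, r):
        # Hamilton product in (w, x, y, z) order
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    q_flip = np.array([0.0, 1.0, 0.0, 0.0])  # 180 deg about x, same as diag(1,-1,-1)

    def ned_attitude_to_nwu(q_body_to_ned):
        # hypothetical helper: re-express a body-to-NED attitude in an NWU world
        return quat_mult(q_flip, np.asarray(q_body_to_ned))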

As an additional side note, I couldn't get the ssf_update node running on the PandaBoard; at first look it seemed like another ARM-architecture issue within ROS.

Thanks for responding!

simonlynen commented 11 years ago

Hi icoderaven,

What is the error you get with ssf_updates on ARM? Could you check whether it is a bus error by attaching gdb?

In case you get a bus error, please use the fix described here:

https://code.ros.org/trac/ros/attachment/ticket/2883/serialization_memcpy_instead_of_reinterpret.diff The file to patch is ros_underlay/roscpp_core/roscpp_serialization/include/ros/serialization.h.

Simon

icoderaven commented 11 years ago

Hi Simon, yes, I had the bus error earlier with PTAM, and I have already applied that patch. This was another bus error coming from elsewhere in serialization, if memory serves me correctly. I'll look into it and let you know where it was.

stephanweiss commented 11 years ago

Hi,

Just saw your reply after having typed the email below; I post it anyway as a reference for others, as a small how-to for debugging. Also, can we move the ARM issue to another thread/issue with an appropriate title?

Some remarks on your bag file: you never seem to have motion for longer than 10-20 seconds. Also, there is at least one significant spike in the acceleration data. During the debug phase, try to do the following:

Record this data (IMU and PTAM; IMU at 100 Hz is sufficient, PTAM as fast as possible). You can check the camera-IMU rotation by analyzing the IMU angular velocities and taking the derivative of the PTAM attitude. With this same data you can also verify the time delay between the two signals, the true frame rate, spikes, etc. If the error does not show itself by analyzing this data, please send the bag file.
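
One way to get that attitude derivative without going through Euler angles (a sketch, assuming scipy is available; q1 and q2 are consecutive PTAM orientation quaternions in ROS (x, y, z, w) order, dt seconds apart):

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def ptam_angular_velocity(q1_xyzw, q2_xyzw, dt):
        # relative rotation between consecutive attitudes, as a rotation
        # vector (axis * angle); dividing by dt gives the mean body rate
        dR = R.from_quat(q1_xyzw).inv() * R.from_quat(q2_xyzw)
        return dR.as_rotvec() / dt  # rad/s, comparable to the gyro stream

Rotate this into the IMU frame with the fixed camera-IMU rotation before overlaying it on the gyro readings; a constant shift between the two curves then exposes the time delay.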

I hope this helps; please let me know. Best Stephan



stephanweiss commented 11 years ago

In your latest image I see "fuzzy tracking" in one of the consoles. Is your map robust, well initialized, and stable under motion?

Do you get continuous measurements on vslam/pose? At what frequency?

Best Stephan



icoderaven commented 11 years ago

Hi Stephan, thanks a lot for the standard debugging procedure; it's really useful! I'll record a bag with this data soon. In the meanwhile, yes, I do have a stable map (I might have reset PTAM in the screenshot earlier), and I get PTAM updates at approximately 15 Hz. Right now I've shifted the camera to look down such that its x axis points forward along the front (x) of the quadrotor.

icoderaven commented 11 years ago

Hi Stephan, so I did create a bag and compared the angular velocity output of the PTAM pose with that provided by the IMU. I see a very close match about the Y and Z axes, but about the X axis the PTAM angular velocity estimate goes bonkers for some odd reason. I am using the quaternion-to-Euler transformation in tf to convert each quaternion to Euler angles, then taking the difference between two consecutive poses and dividing by their time difference. Please see the attached plots: the one on the left is the calculated angular velocity, the one on the right is the IMU message values. (images: fusion_plots, fusion_debug)

Is this normal? Also, here's the bag file https://dl.dropboxusercontent.com/u/37626302/test.bag

Thanks a lot, yet again!

stephanweiss commented 11 years ago

Hi,

From the plots it is hard to tell whether everything matches well; please plot an error plot, or at least an overlay. You can also compare the quaternions directly; this way you avoid potential issues while converting to Euler angles. From the bag file I have the feeling that there are some hiccups in the pose estimate and that the measurements do not arrive regularly. This could be because the map is not robust at every frame, or because of time-sync issues. Is all your data time-synched?
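
A sketch of that direct comparison (assuming scipy, and that the fixed camera-IMU rotation has already been removed from one of the streams): plot the angle of the relative rotation over time; it should stay small and flat if the two attitude estimates agree.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def attitude_error_deg(q_a_xyzw, q_b_xyzw):
        # angle of the rotation taking one attitude estimate to the other
        dR = R.from_quat(q_a_xyzw).inv() * R.from_quat(q_b_xyzw)
        return np.degrees(dR.magnitude())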

Best Stephan

devmax commented 11 years ago

You mention that your IMU assumes a z-axis pointing upward, in which case the initial acceleration in the z direction would be -9.8. However, instead of adding 9.8 to the initial value, you subtract 9.8, which would make the observed z -19.6 in the case of zero motion...

Is there something I'm missing?

stephanweiss commented 11 years ago

The IMU assumes a world z-axis pointing upwards. That is, you have to apply a force of [mass]*9.81 to keep the MAV in hover mode (simply speaking, that acceleration is what the IMU is measuring).
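
In other words, a sketch of the sign convention (not the filter's actual code): with the world z-axis up, a resting or hovering accelerometer reads about +9.81 on body z, and the prediction step removes exactly that:

    import numpy as np

    G = np.array([0.0, 0.0, 9.81])  # what a resting accelerometer reads, world z up

    def world_acceleration(R_body_to_world, f_measured):
        # specific force rotated into the world frame, with the gravity
        # reaction removed; ~zero for a stationary or hovering vehicle
        return R_body_to_world @ np.asarray(f_measured) - G

    print(world_acceleration(np.eye(3), [0.0, 0.0, 9.81]))  # -> [0. 0. 0.]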

Best Stephan



devmax commented 11 years ago

I see, well just to be sure then,

  1. assuming the x-forward, y-left coordinate system you use: if the drone travels to the left, should the acceleration on y be positive or negative?
markusachtelik commented 11 years ago

It should be positive — hope that doesn't cause too much confusion ;)

devmax commented 11 years ago

Ahh, so my ROS driver publishes faulty signs then. I suppose that is my problem, or one of them at least :)

Thank you!

icoderaven commented 11 years ago

Yeah, for instance my IMU reports accelerations in the X-forward, Y-right, Z-down BODY frame, but the angular orientations in the NED frame (or, more precisely, as a rotation matrix that, when multiplied by a vector in the Earth-fixed frame, gives the vector in the BODY frame, i.e. Vl = M*Ve). As you can imagine, it gets pretty confusing trying to figure out the right way to match what is on the Pelican, also because I don't know how, or in what frame, the rotations are provided by the Pelican IMU.