at-wat closed this issue 4 years ago
Frames between odom -> base_link -> sensor_links can be static by using static_transform_publisher. However, an odometry topic containing at least orientation data should be provided. Otherwise, the number of particles and the noise parameters must be increased considerably.
Rviz publishes the pose estimate only on the fixed frame, with a 2D covariance (not distributed in the z direction). In my robot system, I made static frames on each floor and selected the current floor as the fixed frame before setting the pose estimate.
I see you commented that the localisation is not very accurate without odometry. I should be able to get the orientation from my IMU, and maybe I can also simulate odometry with it.
The accuracy depends on the sensor configuration, and the posted screenshots are almost the worst case because of the very narrow vertical sensor view due to the low z-position.
Adding use_neonavigation:=true use_neonavigation:=true
to the test.launch enables this feature. This requires neonavigation meta-package.
The track_odometry node in the meta-package, which combines raw odometry and IMU data while estimating slippage, can translate from IMU to odometry.
Another more elegant way without using neonavigation meta-package might be:
However, imu_transformer node seems currently broken (ros-perception/imu_pipeline#8), so using neonavigation meta-package is easier for now.
When I'm using neonavigation, I only rely on scan matching, right? If I see it correctly, it uses neither the linear acceleration nor the angular velocity nor the orientation. However, I think my structures are sometimes too far away; only in one corner of my map does it move and rotate barely acceptably.
Maybe it can get better by adjusting the parameters you added to the launch file. Can you please tell me about the following parameters? I hope they might help:
<param name="/mcl_3dl/num_particles" value="256" />
<param name="/mcl_3dl/resample_var_x/y/z" value="0.2" />
<param name="/mcl_3dl/resample_var_roll/pitch/yaw" value="0.05" />
<param name="/mcl_3dl/match_dist_min" value="0.8" />
<param name="/mcl_3dl/lpf_step" value="8.0" />
You say that the track_odometry node in the meta-package, which combines raw odometry and IMU data while estimating slippage, can translate from IMU to odometry.
But I want to know if we can go entirely without /odom data, including /odom_raw.
To me it seems that we can go without odometry if we are indoors with big parts of the point cloud on walls. If there are only few parts perpendicular to my movement direction, mcl_3dl won't recognise that; only where I'm turning slowly in a corner does it work for a moment. I'm not using any odometry; I only publish static IMU and Lidar to base_link transformations.
num_particles: the number of particles in the particle filter. Without odometry, the distribution of the estimated pose becomes larger, so this parameter should be larger than it would be with odometry.

resample_var_x/y/z, resample_var_roll/pitch/yaw [meters]: in the resampling phase, the node spreads the copied particles with the distribution specified by these parameters. Large values absorb unpredictable motion (in this case, translational motion without odometry).

match_dist_min [meters]: distances larger than this value in the matching between a measured point and the map are ignored. Without odometry, the matching must have a large tolerance since the motion is not directly predictable.

lpf_step: this node outputs map->odom TF data. The transform output is first-order low-pass filtered with this value as the time constant, since the higher-frequency components are given by the odometry if we have one. So, this value should be smaller than in a configuration with odometry. (This parameter never affects the localization itself; it only affects the appearance of the output.)

track_odometry now has a without_odom:=true option. It just translates IMU data into the base_link frame and fills the odom.pose.pose.orientation field. With this option, it works with IMU data alone (with the imu.orientation field filled).
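As a rough illustration of the lpf_step idea (an illustrative sketch, not the node's actual code), a first-order low-pass filter blends each new value into the output, and a larger time constant means a smoother but laggier map->odom transform:

```python
def lowpass_step(prev, new, dt, time_const):
    """One step of a first-order low-pass filter.

    prev: previous filtered value, new: new raw value,
    dt: time step [s], time_const: filter time constant [s]
    (illustrative stand-in for mcl_3dl's lpf_step; not the node's code).
    """
    alpha = dt / (time_const + dt)  # smoothing factor in (0, 1)
    return prev + alpha * (new - prev)

# A step input converges to the target; a larger time constant converges slower.
y_fast, y_slow = 0.0, 0.0
for _ in range(100):
    y_fast = lowpass_step(y_fast, 1.0, 0.1, 1.0)  # small time constant
    y_slow = lowpass_step(y_slow, 1.0, 0.1, 8.0)  # lpf_step-like value of 8.0
```

This is why a smaller lpf_step makes sense without odometry: the filtered output must follow the particle-filter estimate more quickly when no odometry supplies the high-frequency motion.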
Adding
use_neonavigation:=true use_neonavigation:=true
to the test.launch enables this feature. This requires neonavigation meta-package.
This was my copy-and-paste mistake. Passing use_neonavigation:=true without_odom:=true
to test.launch is correct.
A lot of things depend on the sensor configuration and environment. If you can share your sample bag file, I will see its usability and tune some parameters. It might be much easier and faster, I guess.
Thank you for replying. Now I am recording my own bag file, but I have a problem: "rosbag record buffer exceeded. Dropping oldest queued message." This is because my map cloud, whose raw size is 149 MB, is big.
An independent map file (e.g. in pcd or ply format) is also fine for me if there is a problem packing it.
Yeah, now I see your /tf topic is quite complex. How can I create mine?
My sample data contains a lot of sensor and actuator frames of our robot system. Only base_link to IMU and to the LIDAR(s) are required in this case.
Do you have a urdf or launch file describing the static_transform_publishers? If it is available, you don't have to pack the /tf topic in the bag.
OK, so I will write the static_transform_publishers.
I have set static transforms for /odom to /baselink, /baselink to /imu, and /baselink to /lidar, but how do I set /map to /odom?
/map to /odom will be estimated and broadcast by the node.
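Conceptually (a 2D sketch with made-up numbers, not the node's code), the node chooses map->odom so that composing it with your odom->base_link transform yields the estimated map->base_link pose; you never publish map->odom yourself:

```python
import math

def compose(a, b):
    """Compose two 2D transforms given as (x, y, yaw): apply b in a's frame."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# Static odom -> base_link (identity here, as when no real odometry is published)
odom_to_base = (0.0, 0.0, 0.0)
# map -> odom as estimated and broadcast by the localization node (made-up values)
map_to_odom = (1.5, -0.3, math.pi / 2)
# The resulting map -> base_link pose seen by consumers of TF
map_to_base = compose(map_to_odom, odom_to_base)
```

With a static identity odom->base_link, the whole estimated pose ends up in the map->odom transform.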
Lookup would require extrapolation into the past. Requested time 1489466654.064562082 but the earliest data is at time 1493269462.491682927, when looking up transform from frame [lidar] to frame [odom]
Nodes are running on Unixtime of 1489466654 (14 Mar 2017), but your data is on 1493269462 (27 Apr 2017). I guess your computer clock is wrong.
Also, if you are using rosbag play, see http://answers.ros.org/question/12577/when-should-i-need-clock-parameter-on-rosbag-play/.
When I use the track_odometry node to translate IMU data into the base_link frame and fill the odom.pose.pose field, I find that odom.pose.pose.position and odom.pose.pose.orientation are all NaN. I have set without_odometry=true in the launch file.
Could you share your bag file and static transforms with me by uploading them somewhere? Seeing the actual data and settings makes it easier to find the problem.
OK, I am thinking about where to upload it for you. My IMU data's imu.orientation field doesn't have a value.
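An unfilled imu.orientation (all zeros, the default for an unset message field) would explain the NaNs: a zero quaternion cannot be normalized, so any pose derived from it becomes NaN. A minimal sketch of the effect (illustrative, not the node's code):

```python
import math

def normalize_quat(x, y, z, w):
    """Normalize a quaternion (x, y, z, w). A zero quaternion, as left by an
    unfilled IMU orientation, has no direction; return NaNs, mimicking what
    IEEE floating-point division by a zero norm produces in C++."""
    n = math.sqrt(x * x + y * y + z * z + w * w)
    if n == 0.0:
        nan = float("nan")
        return (nan, nan, nan, nan)
    return (x / n, y / n, z / n, w / n)

# An unfilled sensor_msgs/Imu orientation defaults to all zeros:
q = normalize_quat(0.0, 0.0, 0.0, 0.0)  # every component comes out NaN
```

Once NaN enters the orientation, every downstream pose computation propagates it, which matches the all-NaN odom.pose.pose reported above.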
Please include the raw IMU data and the point cloud. Also, your launch file with the static transforms is necessary.
Here is the data download link. Thank you. https://pan.baidu.com/s/1c1UI7Zi
I saw the data just now. Could you make a bag file containing the IMU data and point clouds from the sensors, instead of the pcd files of the scans and the text file? The current file format is complicated to process.
The pcd file I need is a map point cloud, which might be generated by SLAM.
New data link: https://pan.baidu.com/s/1crN5fC
The issue seems to be caused by the packet size of the input cloud. Since an input cloud packet in my case contains a complete yaw scan, the node updates the particles on every arrival of an input cloud.
For Velodyne, the node must accumulate clouds to reconstruct the full 360 deg of scan. It seems easy to implement a fix; however, I don't have time until the end of this month.
Currently, the node accumulates scans from multiple LIDARs (around https://github.com/at-wat/mcl_3dl/blob/master/src/mcl_3dl.cpp#L473); the fix will be placed around there.
Setting the accum_cloud parameter, which was implemented in #60, to the number of packets (PointCloud2 messages) within one scan of the LIDAR might fix the behavior for Velodyne.
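The idea behind accum_cloud can be sketched as buffering partial-scan packets and triggering a measurement update only once a full revolution's worth has arrived (a hypothetical sketch of the grouping logic, not mcl_3dl's implementation):

```python
def accumulate_scans(packets, accum_cloud):
    """Group incoming partial-scan packets (e.g. Velodyne PointCloud2 messages)
    into full scans of `accum_cloud` packets each; only complete groups would
    trigger a particle update (illustrative sketch, not mcl_3dl's code)."""
    full_scans = []
    buf = []
    for pkt in packets:
        buf.append(pkt)
        if len(buf) == accum_cloud:
            full_scans.append(buf)  # one complete 360 deg scan accumulated
            buf = []
    return full_scans

# 10 packets with accum_cloud=4 yield 2 complete scans; 2 packets stay buffered.
scans = accumulate_scans(list(range(10)), 4)
```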
Fake odometry mode, which uses IMU poses as odometry data, was added in #234. Set fake_odom to true to enable it; it is no longer required to use neonavigation/track_odometry for this.
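Roughly (a guess at the behavior based on the description above, not the actual implementation), fake odometry synthesizes the odometry input from the IMU orientation alone, leaving the translation at zero:

```python
def fake_odom_from_imu(imu_orientation):
    """Build a minimal odometry-like pose from an IMU orientation quaternion
    (x, y, z, w). Translation is unknown without wheel odometry, so it stays
    zero; a hypothetical sketch of the fake_odom idea, not mcl_3dl's code."""
    return {
        "position": (0.0, 0.0, 0.0),     # no translational information available
        "orientation": imu_orientation,  # taken directly from imu.orientation
    }

odom = fake_odom_from_imu((0.0, 0.0, 0.0, 1.0))  # identity orientation
```

As with track_odometry's without_odom mode earlier in this thread, this only works if the imu.orientation field is actually filled by the IMU driver.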
This takes over https://groups.google.com/forum/?utm_medium=email&utm_source=footer#!topic/google-cartographer/ea7bdn0wmvc.