ntnu-arl / gbplanner_ros

Graph-based Exploration Planner for Subterranean Environments
BSD 3-Clause "New" or "Revised" License
690 stars 153 forks

Planner service failed with real robot #49

Closed: ajay1606 closed this issue 6 months ago

ajay1606 commented 8 months ago

Hello,

With your continuous support, we were able to test this planner with a bag file, and everything worked as expected.

However, when I try to test the module with a real robot (a Spot robot), I get the error below.

image

[ INFO] [1711142413.262462715]: getLocalPointcloud time: 0.019490
[ERROR] [1711142413.262526028]: Exception thrown while processing service call: bool pcl::PCA<PointT>::initCompute() [with PointT = pcl::PointXYZI] in /usr/include/pcl-1.10/pcl/common/impl/pca.hpp @ 58 : [pcl::PCA::initCompute] number of points < 3
[ERROR] [1711142413.262815389]: Service call failed: service [/gbplanner] responded with an error: bool pcl::PCA<PointT>::initCompute() [with PointT = pcl::PointXYZI] in /usr/include/pcl-1.10/pcl/common/impl/pca.hpp @ 58 : [pcl::PCA::initCompute] number of points < 3
[ERROR] [1711142413.262846490]: Planner service failed

Also, in RViz there is an error with the marker visualization, as shown below:

Screenshot from 2024-03-22 16-16-02

Any suggestions please !

MihirDharmadhikari commented 8 months ago

Hi @ajay1606 ,

It seems that the voxblox map is not being generated. The error arises because there aren't enough points in the point cloud used to generate the adaptive local bounding box. Can you check that the correct point cloud topic is passed to the gbplanner node, and that a tf exists between the frame in which the point cloud is published and the fixed frame? Also, the fixed frame used by gbplanner is world; check whether your SLAM uses the same one. Otherwise, you can publish a static transform between the two.
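For reference, a static identity transform between the two fixed frames can be published from a launch file. A minimal sketch, assuming the SLAM fixed frame is named map (substitute your actual frame name; the node name here is arbitrary):

```xml
<!-- Sketch: publish a static identity transform world -> map.
     Args are: x y z yaw pitch roll parent_frame child_frame -->
<node pkg="tf2_ros" type="static_transform_publisher" name="world_to_map_tf"
      args="0 0 0 0 0 0 world map" />
```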

Best, Mihir

ajay1606 commented 8 months ago

@MihirDharmadhikari Thank you so much for your quick response. You are right, there was a problem with the input LiDAR data. I had forgotten to remove the param /use_sim_time: true; after setting it to false, I am able to see the data.

However, even after getting the data, path planning still does not work. It just stops at this stage, reporting that no feasible path was found. If I run from the bag file, path planning works fine.

Screenshot from 2024-03-23 13-37-14

For robot control I am using the same pure_pursuit tracker shared in your repository, modified to take its input from a topic instead of an action. I have written a callback inside pure_pursuit that receives the pci_command_path and computes the cmd_velocity and the carrot_point.

Currently the planning_configuration file is set with the following parameters:

run_mode: kReal
trigger_mode: kManual
robot_type: kGround
output_type: kTopic
init_motion_enable: false
planner_trigger_lead_time: 0.0
smooth_heading_enable: true
init_motion:
  z_takeoff: 2.5
  z_drop: 1.7
  x_forward: 3.0

# Group related parameters to be shared between nodes.
RobotDynamics:
  v_max: 0.3
  v_init_max: 0.3
  v_homing_max: 0.3
  yaw_rate_max: 0.5
  dt: 0.05

After moving the robot a little, I am able to see the trajectory, as below:

Screenshot from 2024-03-23 15-41-01

However, the estimated carrot point is too close to the robot's current position to produce any movement, so the robot remains in the same location. Would you please help fix this issue?

Currently I am relying on the /carrot_point pose to control the robot (Spot) via /spot/go_to_pose.

Thanks in advance.

ajay1606 commented 8 months ago

@MihirDharmadhikari By any chance, have you worked with the Spot robot using this planner?

MihirDharmadhikari commented 8 months ago

Hi @ajay1606,

Before you trigger the first planning iteration, you need to move the robot ahead a little to have the area around the robot mapped. That is why the planner was triggered when you moved the robot. I would not recommend using the carrot_point for path tracking. As the entire path is published, I would recommend using a separate path tracker that has a look-ahead distance that you can tune based on your robot's controller. We haven't used the planner with the Spot so I am not sure what APIs it provides.
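A look-ahead point selection of the kind described above can be sketched in a few lines (plain Python, no ROS; the 2D path and look-ahead distance are illustrative assumptions, not part of the planner's API):

```python
import math

def lookahead_point(path, position, lookahead):
    """Return the first path point at least `lookahead` away from the
    robot, searching forward from the closest path point."""
    # Index of the path point closest to the robot.
    closest = min(range(len(path)),
                  key=lambda i: math.dist(path[i], position))
    # Walk forward until a point clears the look-ahead distance.
    for p in path[closest:]:
        if math.dist(p, position) >= lookahead:
            return p
    return path[-1]  # near the end of the path: track the final point

# Straight path along x; robot near the origin.
path = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (1.5, 0.0), (2.0, 0.0)]
print(lookahead_point(path, (0.1, 0.0), 1.0))  # -> (1.5, 0.0)
```

Tuning `lookahead` trades responsiveness against smoothness: a larger value cuts corners but yields gentler steering.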

Best, Mihir

ajay1606 commented 7 months ago

@MihirDharmadhikari Thank you so much for your response. After a week of workarounds, I am able to control the robot through the supported API. The API currently supports controlling the robot from a pose input rather than a command velocity, but either way I am able to drive the robot according to the trajectory output. However, there is another issue when controlling the robot with a pose trajectory. As depicted in the image below, the trajectory generated by the planner is in the world coordinate frame, so when the robot turns I am unable to get the trajectory with respect to the robot's body-fixed frame. Would you please help me resolve this? Have I missed some fixed tf transformation?

image

Currently the odometry and sensor inputs are in the world frame. I would greatly appreciate your support.

MihirDharmadhikari commented 7 months ago

Hi @ajay1606 ,

The point cloud that you send to the planner has to be in the sensor frame and not in the world frame. The map looks reasonable so I assume that you are sending the point cloud in the correct frame. The odometry has to be in the world frame as you correctly mention.

The pose messages that you show in the figure seem correct to me. When the robot turns the orientation field in the pose message reflects the same. Since you need the trajectory in the robot frame, at each control step, you will need to get the tf from the robot's base frame (the frame in which you want the trajectory) to the world frame and transform the trajectory in that frame. You can use the tf2 library which also has good tutorials on how to do it. If your SLAM solution does not publish a tf, then you will have to use the odometry topic and manually build the transformation matrix to do the transformation. Since you are able to build the map, I am guessing that your SLAM solution does publish the tf.
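For the fallback case without a tf, the 2D version of that transformation reduces to applying the inverse of the robot's world pose to each trajectory point. A minimal sketch with plain numpy (illustrative names; a real implementation would use tf2 as suggested above):

```python
import numpy as np

def world_to_robot(points_world, robot_xy, robot_yaw):
    """Transform world-frame 2D points into the robot's body frame,
    given the robot's world pose (x, y, yaw) from odometry."""
    c, s = np.cos(robot_yaw), np.sin(robot_yaw)
    # R maps robot frame -> world frame; its inverse is its transpose.
    R = np.array([[c, -s], [s, c]])
    # Row-vector convention: (p - t) @ R equals R.T @ (p - t).
    return (np.asarray(points_world) - robot_xy) @ R

# Robot at (1, 0) facing +y (yaw = 90 deg); the world point (1, 1) is
# 1 m straight ahead, so it lands on the robot's forward (x) axis.
p = world_to_robot([[1.0, 1.0]], np.array([1.0, 0.0]), np.pi / 2)
print(np.round(p, 6))  # -> [[1. 0.]]
```

The full 3D case is the same idea with a 4x4 homogeneous transform built from the odometry position and quaternion.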

Let me know if this helps.

Best, Mihir

ajay1606 commented 7 months ago

@MihirDharmadhikari Thank you so much, and you are right.

LiDAR: base_link
Odometry: world

listener_->lookupTransform("base_link", "world", ros::Time(0), fixed_to_robot_transform); does the job. (Note the double quotes: in C++, 'base_link' with single quotes is a multi-character char literal and will not compile against the string overload.)

MihirDharmadhikari commented 7 months ago

Glad to hear that it worked!

ajay1606 commented 7 months ago

@MihirDharmadhikari Is there any way I can save the generated path and then navigate along the saved trajectory again?

image

I would like to test the robot following a previously traversed trajectory. It would be very helpful for tuning our control algorithm.

MihirDharmadhikari commented 7 months ago

Currently, this functionality is not provided out of the box; you will have to implement it yourself. The easiest option would be to record the paths/trajectories in a rosbag and then replay it. This way you get the paths on the same topic and with the same datatype.
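If replaying a bag is inconvenient, another simple option (a sketch, not part of the repo; file format and names are assumptions) is to dump each received path's waypoints to a file and read them back later:

```python
import json

def save_path(waypoints, filename):
    """Dump a list of (x, y, z) waypoints to a JSON file."""
    with open(filename, "w") as f:
        json.dump([list(p) for p in waypoints], f)

def load_path(filename):
    """Read the waypoints back as (x, y, z) tuples."""
    with open(filename) as f:
        return [tuple(p) for p in json.load(f)]

save_path([(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)], "/tmp/saved_path.json")
print(load_path("/tmp/saved_path.json"))  # -> [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)]
```

The bag-replay route is still preferable when you also want timing and headers preserved.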

AngelRodriguez8008 commented 6 months ago

Hi @ajay1606,

I will try to configure gbplanner on the BD Spot as well. Thanks for sharing your findings. I think you can save the path and map using the Spot GraphNav service provided in the SDK.

ajay1606 commented 6 months ago

@AngelRodriguez8008 Thanks for your response, I will look into it.