tomasvr / turtlebot3_drlnav

A ROS2-based framework for TurtleBot3 DRL autonomous navigation

Hints on transferring a model on a real turtlebot/robot #2

Closed loikun closed 5 months ago

loikun commented 1 year ago

Thanks for this DRL project running on Ubuntu Focal (the only one I have found so far that runs under Focal; very useful).

My question is about how to transfer a trained model on a real turtlebot.

Am I right? Is this the correct way? I would be thankful for any input or code skeleton. Best regards,

tomasvr commented 1 year ago

That is correct. In fact, I ran both the drl_environment and drl_agent nodes on the remote PC for simplicity; the robot only provided the remote PC with LiDAR scan and odometry information. This can, however, cause latency issues if you send a large number of scans over the network, so you need to ensure you have a stable LAN connection.

It is of course possible to run all of the nodes on the robot itself if the on-board computer is powerful enough. This would be preferable, as it removes the need to send data over the LAN.

I will upload some additional code and instructions for running the model on a physical robot in a few days.

tomasvr commented 1 year ago

I have improved the support for running on a real robot with commit 887153b. That should make it easier to get started: you can now use real_environment and real_agent as described in the documentation to get running quickly.

loikun commented 1 year ago

Many thanks. I will test it before the end of the week. I had just started a few days ago to adapt the code to the TurtleBot3. I began by downsampling the 360 points per LiDAR revolution of the TurtleBot3 to match the 40 points of the simulation, and thus the trained networks. But now I will switch to your implementation and test it.
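For reference, the downsampling step described above can be sketched in a few lines. This is a hypothetical illustration, not the repository's code: it picks evenly spaced beams from a full revolution so that a 360-beam scan matches the 40 inputs the trained network expects (the names `downsample_scan` and `N_SAMPLES` are my own).

```python
import numpy as np

N_SAMPLES = 40  # number of beams the trained network expects

def downsample_scan(ranges, n_samples=N_SAMPLES):
    """Pick n_samples evenly spaced beams from one LiDAR revolution."""
    ranges = np.asarray(ranges, dtype=float)
    idx = np.linspace(0, len(ranges) - 1, n_samples).round().astype(int)
    return ranges[idx]

full_scan = np.linspace(0.11, 3.5, 360)  # fake 360-beam revolution
reduced = downsample_scan(full_scan)
print(len(reduced))  # 40
```

Using `linspace` over the beam indices (rather than a plain stride) keeps the sample count exact even when the beam count is not evenly divisible by the target count.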

LihanChen2004 commented 9 months ago

> Many thanks. I will test it before the end of the week. I had just started a few days ago to adapt the code to the turtlebot3. I begun with downsampling the 360 points per LIDAR revolution of the Turtlebot3 in order to match the 40 points of the simulation and thus the trained networks. But now I will switch to your implementation and test it.

@loikun I would like to know how you downsampled the LiDAR. Could you recommend some open-source packages for LiDAR downsampling?

loikun commented 9 months ago

To experiment on the real TurtleBot3 without modifying the simulation and training, we updated the function scan_callback(self, msg) in drl_environment_real.py. The update:

  1. downsamples the LiDAR msg.ranges (360 values) to REAL_N_SCAN_SAMPLES values, stored in a new array self.scan_ranges;
  2. then removes any invalid value from self.scan_ranges, i.e. any value outside the LiDAR's operating range (0.11-3.5 meters). In particular, when there is no obstacle within 3.5 m, the current ROS LiDAR driver reports 0 instead of +inf as before, so values close to 0 have to be ignored. (The ROS conventions say that any value outside the sensor's range must be ignored or treated as invalid.)

Real experiments on the TurtleBot3, based on the training in simulation, were not very reliable in our case: sometimes it works, sometimes not. We plan to investigate further in the coming months.

Our updated code, to be compared with the original:

    def scan_callback(self, msg):
        # Downsample the full revolution (e.g. 360 beams) to REAL_N_SCAN_SAMPLES beams
        if len(msg.ranges) > REAL_N_SCAN_SAMPLES:
            step = len(msg.ranges) // REAL_N_SCAN_SAMPLES
            self.scan_ranges = list(msg.ranges[::step])[:REAL_N_SCAN_SAMPLES]
        else:
            self.scan_ranges = list(msg.ranges)
        if len(self.scan_ranges) != REAL_N_SCAN_SAMPLES:
            print(f"more or less scans than expected! check model.sdf, got: {len(msg.ranges)}, expected: {REAL_N_SCAN_SAMPLES}")
        # The real driver reports 0 when nothing is in range; treat near-zero readings as "no obstacle"
        self.scan_ranges = [float('inf') if x < 0.05 else x for x in self.scan_ranges]
        # Normalize laser values to [0, 1] and track the closest obstacle
        self.obstacle_distance = 1
        for i in range(REAL_N_SCAN_SAMPLES):
            self.scan_ranges[i] = numpy.clip(float(self.scan_ranges[i] - REAL_LIDAR_CORRECTION) / REAL_LIDAR_DISTANCE_CAP, 0, 1)
            if self.scan_ranges[i] < self.obstacle_distance:
                self.obstacle_distance = self.scan_ranges[i]
        self.obstacle_distance *= REAL_LIDAR_DISTANCE_CAP

LihanChen2004 commented 9 months ago

Thank you very much! We've solved the problem with your downsampling program.

We also encountered the issue you mentioned ("sometimes it works, sometimes not") at the beginning.

Our solution was to replace the LiDAR with an RPLIDAR S1, because the LDS-01 LiDAR on the TurtleBot3 sometimes detects very close but non-existent obstacles.

We also noticed that the mounting orientation of the LiDAR affected the navigation; ours was oriented towards the TurtleBot's two wheels.

[image: IMG_7396, showing the LiDAR mounting orientation]

We also changed the real-robot parameters in settings.py, which may affect the obstacle avoidance behavior as well:

REAL_N_SCAN_SAMPLES         = 40    # LiDAR density count your robot is providing
REAL_ARENA_LENGTH           = 10   # meters
REAL_ARENA_WIDTH            = 10   # meters
REAL_SPEED_LINEAR_MAX       = 0.3  # in m/s
REAL_SPEED_ANGULAR_MAX      = 2.0   # in rad/s

REAL_LIDAR_CORRECTION       = 0.1  # meters, subtracted from the real LiDAR values
REAL_LIDAR_DISTANCE_CAP     = 10.5   # meters, scan distances are capped at this value
REAL_THRESHOLD_COLLISION    = 0.005  # meters, minimum distance to an object that counts as a collision
REAL_THRESHOLD_GOAL         = 0.35  # meters, minimum distance to goal that counts as reaching the goal
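As a sanity check on these values, here is a hypothetical worked example (not from the repository) of how the normalization in scan_callback maps raw readings through the correction and cap parameters above. Note that with REAL_LIDAR_DISTANCE_CAP = 10.5, a nearby 0.35 m reading ends up as a very small normalized value:

```python
import numpy as np

REAL_LIDAR_CORRECTION   = 0.1   # meters, subtracted from the real LiDAR values
REAL_LIDAR_DISTANCE_CAP = 10.5  # meters, scan distances are capped at this value

def normalize_range(r):
    """Map a raw range reading (meters) to the [0, 1] network input."""
    return float(np.clip((r - REAL_LIDAR_CORRECTION) / REAL_LIDAR_DISTANCE_CAP, 0.0, 1.0))

print(normalize_range(0.35))        # (0.35 - 0.1) / 10.5 ≈ 0.0238
print(normalize_range(float('inf')))  # clipped to 1.0
print(normalize_range(0.05))        # below the correction, clipped to 0.0
```

Because the simulation was presumably trained with a smaller distance cap, raising the cap compresses all nearby obstacles into a narrow band near 0, which could itself change the avoidance behavior.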

Our TurtleBot is currently running the repository's ddpg_0_stage9_episode8000 and can avoid obstacles successfully. However, there are still issues with navigating to specified target points. We suspect that it may be related to the coordinate system.
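If the issue is indeed the coordinate system, one thing worth checking is how the goal is expressed relative to the robot. A minimal sketch (my own illustration, assuming a 2D odometry pose (x, y, yaw) given in the same fixed frame as the goal) of computing the goal distance and heading in the robot frame:

```python
import math

def goal_in_robot_frame(robot_x, robot_y, robot_yaw, goal_x, goal_y):
    """Return (distance, heading) to the goal, heading relative to the robot's yaw."""
    dx, dy = goal_x - robot_x, goal_y - robot_y
    distance = math.hypot(dx, dy)
    heading = math.atan2(dy, dx) - robot_yaw
    # Wrap the heading to [-pi, pi] so the agent never sees angles outside its training range
    heading = math.atan2(math.sin(heading), math.cos(heading))
    return distance, heading

# Robot at the origin facing +y; a goal 1 m straight ahead should give heading 0
print(goal_in_robot_frame(0.0, 0.0, math.pi / 2, 0.0, 1.0))  # (1.0, 0.0)
```

A mismatch between the frame the goal is published in and the frame the odometry reports (or a missing yaw offset between them) would produce exactly the symptom described: obstacle avoidance works, but the robot drives to the wrong place.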