reiniscimurs / DRL-robot-navigation

Deep Reinforcement Learning for mobile robot navigation in ROS Gazebo simulator. Using Twin Delayed Deep Deterministic Policy Gradient (TD3) neural network, a robot learns to navigate to a random goal point in a simulated environment while avoiding obstacles.

Rviz issue #5

Closed: reiniscimurs closed this issue 2 years ago

reiniscimurs commented 2 years ago

Hi, thank you for sharing your code. I'm new to ROS/Gazebo. I have installed ROS Noetic and played with some tutorials. We are aiming to develop path planning algorithms inspired by hemispheric lateralization, and I found your work interesting. Currently, I'm trying to replicate the path planning using your code.

To keep things simple, I have just loaded the Gazebo environment. The robot does not move in Rviz, although it is moving in Gazebo. The robot model, camera, and laser scan all show a status error, e.g. No transform from [r1/base_link] to [base_link]. Thanks in advance.

my file velodyne_ab.py looks like:

```python
#############################################
import os
import time

import numpy as np

from velodyne_env import GazeboEnv

# variables initialization
# ----

env = GazeboEnv('multi_robot_scenario.launch', 1, 1, 1)

for i in range(10000):
    print("AB Step")
    if i % 100 == 0 and i > 0:
        env.reset()

    # random action, rescaled to the environment's expected input range
    action = (1 + np.random.normal(0, expl_noise, size=action_dim)).clip(
        -max_action, max_action
    )
    a_in = [(action[0] + 1) / 2, action[1]]
    next_state, reward, done, target = env.step(a_in)
```

Originally posted by @AbuZalayedHassan in https://github.com/reiniscimurs/DRL-robot-navigation/issues/1#issuecomment-982048248

reiniscimurs commented 2 years ago

Hi @AbuZalayedHassan,

Could you post a picture of your output of rviz or the log from the terminal? It's difficult to say what could be the issue without knowing more about the problem.

In the meantime, could you check the output of your rqt graph? Also, what is the Fixed Frame set to under Global Options in Rviz?
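If it helps, a minimal tf sanity check along these lines can confirm whether the transform exists at all (this is just a sketch; the node name is arbitrary and the frame names are simply the ones from your error message):

```python
#!/usr/bin/env python
# Minimal tf sanity check: look up the transform named in the error message.
import rospy
import tf2_ros

rospy.init_node("tf_check")
tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

try:
    # Wait up to 5 s for the transform between the two frames from the error.
    transform = tf_buffer.lookup_transform(
        "base_link", "r1/base_link", rospy.Time(0), rospy.Duration(5.0)
    )
    rospy.loginfo("Transform found:\n%s", transform)
except (tf2_ros.LookupException,
        tf2_ros.ConnectivityException,
        tf2_ros.ExtrapolationException) as exc:
    rospy.logwarn("No transform between frames: %s", exc)
```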

DevAB-Git commented 2 years ago

Hi Reiniscimurs,

Thank you very much for your reply.

1. The pictures of Rviz and the terminal log are attached herewith.
2. The output of the rqt graph is also attached.
3. The fixed frame is base_link.

Attachments: rosgraph, rviz1, rviz2, terminal-log.txt

Please let me know if any other information is required.

reiniscimurs commented 2 years ago

The current implementation is built on ROS Melodic. As far as I can tell, the robot_state_publisher and tf packages changed between ROS versions and no longer allow tf_prefix, which pretty much breaks the whole tf tree here. I played around with the Noetic version a bit; if you remove the tf_prefix and name_space you can get some visual output in Rviz, but it does not work quite well.

I will look into it, but generally don't think I will invest a lot of time to make it work for Noetic. If you come up with a solution, please let me know or push a contribution to the repo. Would be appreciated.

In the meantime, either use ROS Melodic or ignore Rviz.

DevAB-Git commented 2 years ago

Hi Reiniscimurs,

Thank you for your time. In addition to the packages you mentioned, message_to_tf is also not working in Noetic. I have removed tf_prefix; consequently, the camera started working, but the robot is still not moving in Rviz. Moreover, I could not find name_space. Could you please mention the file(s) that contain name_space so that I can remove it as well?

Furthermore, if I ignore Rviz, does it make any difference in learning the policy using the path-planning algorithm?

Thanks in advance

reiniscimurs commented 2 years ago

The namespace tag is the <group ns="$(arg robot_name)"> element in the pioneer3dx.gazebo.launch file.

state_publisher also changed to robot_state_publisher in the Noetic version. I managed to get it working for all the fixed objects, but the continuous joints do not connect in the tf tree. This leads me to believe that something is off with the joint_state_publisher frames.
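One quick way to test that hypothesis is to publish dummy joint states by hand and see whether robot_state_publisher then adds the wheel frames, since it only broadcasts transforms for non-fixed joints once it receives their positions on joint_states. A minimal sketch (the joint names below are placeholders and have to match the URDF; the topic may also need the robot namespace prefix):

```python
#!/usr/bin/env python
# Publish dummy joint states so robot_state_publisher can add the continuous
# (wheel) joints to the tf tree. Joint names are placeholders.
import rospy
from sensor_msgs.msg import JointState

rospy.init_node("dummy_joint_state_publisher")
pub = rospy.Publisher("joint_states", JointState, queue_size=10)
rate = rospy.Rate(50)

while not rospy.is_shutdown():
    msg = JointState()
    msg.header.stamp = rospy.Time.now()
    msg.name = ["left_wheel_joint", "right_wheel_joint"]  # placeholder names
    msg.position = [0.0, 0.0]
    pub.publish(msg)
    rate.sleep()
```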

Will investigate further when I have the time

reiniscimurs commented 2 years ago

Furthermore, if I ignore Rviz, does it make any difference in learning the policy using the path-planning algorithm?

I cannot be sure, as I don't know what you mean by learning the policy using the path-planning algorithm. It appears to me that the learning is happening and only the visualization is broken, but I have not trained the network on Noetic, so I cannot say anything for certain.

DevAB-Git commented 2 years ago

Thank you very much for all the support. Please share your solution if you manage to run it on ROS Noetic. Many people have problems when moving from Melodic to Noetic, so it will help the community, especially new researchers, to learn quickly. We will definitely acknowledge your support and cite your work.

reiniscimurs commented 2 years ago

I think I mainly managed to solve it by explicitly starting the joint_state_publisher node in the pioneer3dx.gazebo.launch and pioneer3dx.urdf.launch files. That does the trick and adds the continuous joints to the tf tree (see the attached frames graph).

Check out the Noetic branch; it would be good if you could test it and see whether it works on setups other than mine. There does appear to be an issue with the laser data readings, but I will look into that later.

DevAB-Git commented 2 years ago

Hi,

Thank you for your time and support. I have tested the Noetic branch and it is working for me :)

reiniscimurs commented 2 years ago

Great! Do you have an issue from time to time with an error from the laser sensor?

DevAB-Git commented 2 years ago

Currently, I'm playing with simple hard-coded routines that enable the robot to follow a specific path. I'll explore further and update you later.

reiniscimurs commented 2 years ago

I have updated the way the sensor data is read and am not witnessing any more issues, so I will consider the Noetic branch up and running and close the issue.
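For anyone reading along, reading the Velodyne point cloud defensively in ROS1 Python looks roughly like the sketch below. The topic name and the distance computation are illustrative assumptions, not necessarily the exact change made on the Noetic branch:

```python
#!/usr/bin/env python
# Illustration only: a common, defensive way to read a Velodyne point cloud.
import numpy as np
import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2


def velodyne_callback(msg):
    # skip_nans drops invalid returns that can otherwise break downstream code
    points = list(point_cloud2.read_points(msg, field_names=("x", "y", "z"),
                                           skip_nans=True))
    if not points:
        rospy.logwarn("Empty point cloud received")
        return
    distances = np.linalg.norm(np.array(points), axis=1)
    rospy.loginfo("min obstacle distance: %.2f m", distances.min())


rospy.init_node("velodyne_reader")
rospy.Subscriber("/velodyne_points", PointCloud2, velodyne_callback, queue_size=1)
rospy.spin()
```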

marwaSaid commented 2 years ago

solved