shinkansan / 2019-UGRP-DPoom

2019 DGIST DPoom project under UGRP : SBC and RGB-D camera based full autonomous driving system for mobile robot with indoor SLAM
https://shinkansan.github.io/2019-UGRP-DPoom/
Apache License 2.0

Info about path planning module #4

Closed germal closed 4 years ago

germal commented 4 years ago

Hi,

After generating the map with the D435i and activating the localization node, how is it possible to use the path planning module? Thanks a lot, Germal

SeunghyunLim commented 4 years ago

Hi @germal If you want to use the path planning module separately, you should prepare a 2D map image, just like lobby.jpg. (This map image can be obtained from SLAM, drawn by hand, etc.)

In Astar.py, img2binList converts this image into a binary list so that it can be used for path planning. So you can generate your path with _your_map.jpg by calling

pathplanning(start, end, image_path="_your_map.jpg", verbose=1)

in Astar.py. Note that the image should be placed in the same directory as Astar.py.
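
For example, a minimal call might look like the sketch below. The start/end coordinates and the returned path format are placeholders for illustration, not taken from Astar.py; check img2binList to see how your map image is scaled and binarized.

```python
# Minimal usage sketch (coordinates are illustrative assumptions).
from Astar import pathplanning

start = (10, 10)   # assumed: start cell on the binarized map
end = (60, 80)     # assumed: goal cell

# verbose=1 presumably enables debug output/visualization of the search.
path = pathplanning(start, end, image_path="_your_map.jpg", verbose=1)
print(path)        # expected: list of waypoints from start to end
```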

germal commented 4 years ago

Hello @SeunghyunLim, thank you very much for your kind explanation! So does the path planning module you created work independently from the ROS navigation (costmaps and goal reaching), or can it be integrated with it? Thanks a lot

SeunghyunLim commented 4 years ago

Hi @germal If the ROS navigation means the navigation package on the ROS wiki, our module works independently of it, because we didn't use that package in our project. Thanks!

germal commented 4 years ago

Thank you @SeunghyunLim! How would it be possible to send the goal position to autodrive.py and let the robot reach it autonomously? And how is it possible to integrate MORP obstacle detection with the autonomous navigation part? Thanks a lot for your help

SeunghyunLim commented 4 years ago

Hi @germal!

For your first question, we import Astar.py in autodrive.py so that we can generate the path from the start to the end position. On line 188 in autodrive.py, you can check the destination coordinate. (With some additional code for generating a random goal position, such as Walkable_area_extraction.py, you can also set the goal randomly.) We designed the driving module dap.py, which moves the robot along the given path, so we pass the generated path to the function dap from dap.py. Please check line 132 in autodrive.py.
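
To make the wiring concrete, here is a rough sketch of that flow. The coordinate values and the exact dap signature are assumptions; the real code is at lines 132 and 188 of autodrive.py.

```python
# Rough sketch of the Astar -> dap flow (values are illustrative).
from Astar import pathplanning
from dap import dap

start = (5, 5)     # assumed: robot's current cell on the 2D map
goal = (40, 25)    # destination coordinate (cf. line 188 in autodrive.py)

# Generate the waypoint path on the binarized map.
path = pathplanning(start, goal, image_path="lobby.jpg", verbose=0)

# Drive the robot along the generated path (cf. line 132 in autodrive.py).
dap(path)
```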

For the second question, DRIFAT.py is the system integration code for our robot. We separated the handle module into 'autodrive' and 'morp' in _handlemaster. To put it simply, we run the SLAM thread first, then load the autodrive and morp threads, and pass the handle to the specific mode according to the environment condition using the _morp_daptransfer thread. Thanks!
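
As a simplified sketch of that thread layout: the worker names and the handle-switching condition below are illustrative only; see DRIFAT.py and the _handlemaster / _morp_daptransfer code for the real integration.

```python
# Simplified, illustrative sketch of the DRIFAT-style thread layout.
import threading
import time

handle = {"mode": "autodrive"}  # which module currently drives the robot

def slam_worker():
    while True:
        time.sleep(0.1)  # placeholder: localization / mapping updates

def autodrive_worker():
    while True:
        if handle["mode"] == "autodrive":
            pass  # placeholder: follow the planned path (dap)
        time.sleep(0.1)

def morp_worker():
    while True:
        if handle["mode"] == "morp":
            pass  # placeholder: react to the detected obstacle
        time.sleep(0.1)

def transfer_worker():
    # cf. _morp_daptransfer: hand control to 'morp' or 'autodrive'
    # depending on the environment condition.
    while True:
        obstacle_ahead = False  # placeholder for MORP's detection result
        handle["mode"] = "morp" if obstacle_ahead else "autodrive"
        time.sleep(0.1)

# SLAM starts first, then the handle modules and the transfer thread.
for worker in (slam_worker, autodrive_worker, morp_worker, transfer_worker):
    threading.Thread(target=worker, daemon=True).start()
```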

germal commented 4 years ago

Hi @SeunghyunLim, I am really grateful to you for this explanation and for this great project! Now I grasp the basics to experiment with it. Cheers, germal

Shrioe commented 4 months ago

Hi,

I am following the instructions of the "SLAM with D435i tutorial" and ran roslaunch realsense2_camera opensource_tracking.launch.

I encountered some errors as shown below:

[ERROR] [1718263335.611787184]: Critical Error, NaNs were detected in the output state of the filter. This was likely due to poorly conditioned process, noise, or sensor covariances.
[ WARN] [1718263335.612158811]: Failed to meet update rate! Took 0.0086891700000000012677
Error: TF_NAN_INPUT: Ignoring transform for child_frame_id "camera_link" from authority "unknown_publisher" because of a nan value in the transform (nan nan nan) (-nan nan -nan nan) at line 240 in /tmp/binarydeb/ros-noetic-tf2-0.7.7/src/buffer_core.cpp
Error: TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "camera_link" from authority "unknown_publisher" because of an invalid quaternion in the transform (-nan nan -nan nan) at line 255 in /tmp/binarydeb/ros-noetic-tf2-0.7.7/src/buffer_core.cpp

I also faced an issue when trying to replay the rosbag file after running "roslaunch realsense2_camera opensource_tracking.launch offline:=true": it only opens a new RViz window without showing anything. May I ask what I am supposed to see at this step?

Could you please provide some suggestions?

Thank you!