Teja10 opened 3 years ago
We successfully completed controlling the car through the keyboard using the donkey_car package that last year's senior design group built. However, when going from keyboard control to autonomous navigation, we haven't figured out the right approach yet. Currently, when we try to send a simple navigation goal through code, the client waits for the server forever, so we need to figure out what's wrong.
Simple navigation goal (C++): http://wiki.ros.org/navigation/Tutorials/SendingSimpleGoals
Simple navigation goal (Python): https://hotblackrobotics.github.io/en/blog/2018/01/29/action-client-py/
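For reference, a minimal Python sketch along the lines of those tutorials, with an explicit timeout so the client fails loudly instead of waiting forever. The action name `move_base` and the goal coordinates are assumptions for illustration:

```python
#!/usr/bin/env python
# Sketch: send one goal to move_base via actionlib, with a timeout on
# wait_for_server so a missing server is reported instead of hanging.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_goal_once')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)

# If this times out, the move_base action server is not running (or lives
# under a different name/namespace) -- the likely cause of an infinite wait.
if not client.wait_for_server(rospy.Duration(10.0)):
    rospy.logerr('move_base action server not available')
else:
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0   # 1 m forward in the map frame
    goal.target_pose.pose.orientation.w = 1.0
    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo('result state: %d', client.get_state())
```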
We set up a ROS bridge between our own PC and the TX2, so that rviz on the PC can receive visualization data from the car. In addition, to work on the move_base issue, I read through the code of teleop_twist_keyboard and low_level_control.py, which handle keyboard navigation. We might need to write some code for move_base next week.
teleop_twist_keyboard: https://github.com/ros-teleop/teleop_twist_keyboard
low_level_control: https://github.com/tizianofiorenzani/ros_tutorials/tree/master/donkey_car/src
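For context, everything teleop_twist_keyboard ultimately does is publish geometry_msgs/Twist messages on /cmd_vel, which low_level_control then translates into PWM. A minimal sketch with hard-coded, illustrative values:

```python
#!/usr/bin/env python
# Sketch: publish a fixed Twist on /cmd_vel for ~2 s, then stop -- the same
# interface teleop_twist_keyboard drives from keypresses.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('cmd_vel_test')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.sleep(1.0)              # give subscribers time to connect

msg = Twist()
msg.linear.x = 0.2            # forward speed (illustrative)
msg.angular.z = 0.5           # turn rate (illustrative)
rate = rospy.Rate(10)
for _ in range(20):
    pub.publish(msg)
    rate.sleep()
pub.publish(Twist())          # zero Twist = stop
```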
First, the camera is successfully streaming video to rviz on the host PC. If we use the "Add" button in rviz to add the camera topic and wait long enough, we eventually get the video stream (though with a large delay, so not really real-time). Second, we successfully connected the car with rtabmap and move_base.
We launch rtabmap first. It is connected to the ZED camera and publishes the transform, map, and odometry information to /tf, /zed/map, and /zed/zed_node/odom.
roslaunch zed-rtabmap-example zed-rtabmap.launch
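A small sanity-check sketch for this step, assuming the topic and frame names above (`base_link` is a guess for the robot base frame; adjust to whatever the launch file actually uses):

```python
#!/usr/bin/env python
# Sketch: confirm rtabmap/zed are publishing odometry and the map transform.
import rospy
import tf
from nav_msgs.msg import Odometry

rospy.init_node('check_rtabmap_outputs')
rospy.wait_for_message('/zed/zed_node/odom', Odometry, timeout=10.0)
rospy.loginfo('odometry is being published')

listener = tf.TransformListener()
# 'base_link' is an assumed frame name -- check with: rosrun tf view_frames
listener.waitForTransform('map', 'base_link', rospy.Time(0), rospy.Duration(10.0))
rospy.loginfo('map -> base_link transform is available')
```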
Then, in order to control the car, we need to plug in the ESC on the car and run the PWM node on the board.
rosrun i2cpwm_board i2cpwm_board
When we see "Setting PWM frequency to 50 Hz", we know that the PWM board is ready.
Note: because the car's behavior can be inconsistent, it is recommended to test the car's throttle and steering before actually sending commands, e.g. with a quick bench test like the sketch below.
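A possible bench-test sketch (wheels off the ground). The channel numbers and PWM values below are assumptions; take the real ones from low_level_control.py for this car before running anything:

```python
#!/usr/bin/env python
# Sketch: drive the PWM board directly on /servos_absolute to test the ESC
# and steering servo. All channel numbers and values are assumptions.
import rospy
from i2cpwm_board.msg import Servo, ServoArray

rospy.init_node('servo_bench_test')
pub = rospy.Publisher('/servos_absolute', ServoArray, queue_size=1)
rospy.sleep(1.0)

def send(throttle, steering):
    msg = ServoArray()
    msg.servos.append(Servo(servo=1, value=throttle))   # assumed throttle channel
    msg.servos.append(Servo(servo=2, value=steering))   # assumed steering channel
    pub.publish(msg)

send(333, 333)      # assumed neutral -- arms most ESCs
rospy.sleep(2.0)
send(350, 333)      # small forward throttle
rospy.sleep(1.0)
send(333, 333)      # back to neutral
```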
Afterwards, we launch low_level_control, which is included in the "keyboard_demo.launch" file and connects the /cmd_vel output to the car's actual motors.
roslaunch donkey_car keyboard_demo.launch
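Conceptually, the mapping low_level_control performs is something like the sketch below; the scale factors, channel numbers, and neutral value are illustrative stand-ins, not the actual numbers from the file:

```python
#!/usr/bin/env python
# Sketch: geometry_msgs/Twist on /cmd_vel -> absolute PWM on /servos_absolute.
import rospy
from geometry_msgs.msg import Twist
from i2cpwm_board.msg import Servo, ServoArray

NEUTRAL, HALF_RANGE = 333, 90    # assumed PWM neutral and half-range

def clamp(v):
    return max(NEUTRAL - HALF_RANGE, min(NEUTRAL + HALF_RANGE, v))

def on_cmd_vel(twist):
    msg = ServoArray()
    msg.servos.append(Servo(servo=1, value=clamp(NEUTRAL + twist.linear.x * HALF_RANGE)))
    msg.servos.append(Servo(servo=2, value=clamp(NEUTRAL + twist.angular.z * HALF_RANGE)))
    pub.publish(msg)

rospy.init_node('cmd_vel_to_pwm_sketch')
pub = rospy.Publisher('/servos_absolute', ServoArray, queue_size=1)
rospy.Subscriber('/cmd_vel', Twist, on_cmd_vel)
rospy.spin()
```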
Finally, we launch the move_base file.
roslaunch champ_control move_base.launch
This file loads the global and local planner/costmap parameters that we would like to set, and it probably needs some revision.
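For reference, the parameters in question typically look something like this. This is a sketch using standard base_local_planner names; the champ launch may load a different local planner, and the values for this car still need tuning:

```yaml
# base_local_planner_params.yaml (illustrative values only)
TrajectoryPlannerROS:
  max_vel_x: 0.5            # top forward speed (m/s)
  min_vel_x: 0.1
  max_vel_theta: 1.0        # top turn rate (rad/s)
  acc_lim_x: 1.0
  acc_lim_theta: 2.0
  holonomic_robot: false    # the car cannot strafe
```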
Then we set a goal using the "2D Nav Goal" button in rviz to send goals to the car, and keyboard_demo will show the speed and steering values that it is supposed to be sending.
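If goals ever stop arriving, note that rviz's "2D Nav Goal" button publishes a geometry_msgs/PoseStamped on /move_base_simple/goal, so we can echo that topic on the TX2 to confirm the goal makes it across the ROS bridge (a small debugging sketch):

```python
#!/usr/bin/env python
# Sketch: log every goal that arrives from rviz's "2D Nav Goal" button.
import rospy
from geometry_msgs.msg import PoseStamped

def on_goal(goal):
    rospy.loginfo('goal received: x=%.2f y=%.2f (frame %s)',
                  goal.pose.position.x, goal.pose.position.y,
                  goal.header.frame_id)

rospy.init_node('goal_echo')
rospy.Subscriber('/move_base_simple/goal', PoseStamped, on_goal)
rospy.spin()
```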
Current Problems:
The car can now move by itself! Regarding the problems from last week:
Current Problems:
Overall, the car is performing some kind of autonomous driving. The path and map are also visible on another computer when the TX2 is not connected to an HDMI port. The other computer can also set goals after the TX2 is disconnected from HDMI (only once, though). So we pretty much reproduced last year's results.
We weren't able to make progress because of the power outage that happened during the week, but I tried starting to install donkey car the way they did last year. I had trouble connecting through ssh and will continue working on it.
http://docs.donkeycar.com/guide/robot_sbc/setup_jetson_nano/#step-4-install-donkeycar-python-code
https://medium.com/@feicheung2016/getting-started-with-jetson-nano-and-autonomous-donkey-car-d4f25bbd1c83
https://medium.com/@heldenkombinat/getting-started-with-the-jetson-nano-37af65a07aab#f208