yoyo802088 / ARB-Y

RoveCrest project for the EEC193AB course and University Rover Challenge

Duplicate results from last year. Figure out potential improvements. #7

Open Teja10 opened 3 years ago

reginazhai commented 3 years ago

We weren't able to implement this due to the power outage that happened during the week, but I started installing Donkey Car the way they did last year. I had trouble connecting through SSH and will continue working on it.

http://docs.donkeycar.com/guide/robot_sbc/setup_jetson_nano/#step-4-install-donkeycar-python-code
https://medium.com/@feicheung2016/getting-started-with-jetson-nano-and-autonomous-donkey-car-d4f25bbd1c83
https://medium.com/@heldenkombinat/getting-started-with-the-jetson-nano-37af65a07aab#f208

reginazhai commented 3 years ago

We successfully controlled the car through the keyboard using the donkey_car package that last year's senior design group built. However, we haven't figured out the right approach for going from keyboard control to autonomous navigation yet. Currently, when we try to send a simple navigation goal through code, the client waits for the server forever, so we need to figure out what's wrong.

Simple navigation goal (C++): http://wiki.ros.org/navigation/Tutorials/SendingSimpleGoals
Simple navigation goal (Python): https://hotblackrobotics.github.io/en/blog/2018/01/29/action-client-py/
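
For reference, here is a minimal sketch of the Python action-client approach from the tutorials above, assuming move_base is running and publishing in a "map" frame (the frame name and goal coordinates are placeholders). If the client hangs at wait_for_server(), that usually means the move_base action server is not actually up or reachable over the network.

```python
#!/usr/bin/env python
# Minimal move_base action client, following the linked tutorials.
# Assumes move_base is running; frame name and coordinates are placeholders.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_goal(x, y):
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    # If this blocks forever, the move_base action server is not running
    # or the ROS master/network configuration between machines is wrong.
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # face forward

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('simple_goal_client')
    send_goal(1.0, 0.0)
```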

reginazhai commented 3 years ago

We set up a ROS bridge between our own PC and the TX2, so that rviz on the PC can receive visualization data from the car. In addition, to work on the move_base issue, I checked the code for teleop_twist_keyboard and low_level_control.py, which handle keyboard navigation. We might need to write some code for move_base next week.

teleop_twist_keyboard: https://github.com/ros-teleop/teleop_twist_keyboard
low_level_control: https://github.com/tizianofiorenzani/ros_tutorials/tree/master/donkey_car/src
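
Roughly, low_level_control.py maps /cmd_vel onto the PWM channels exposed by i2cpwm_board. The sketch below shows that idea; the servo channel numbers, PWM center/range, and scaling are assumptions, and the real calibration lives in the linked repository.

```python
#!/usr/bin/env python
# Sketch of the low_level_control idea: map /cmd_vel onto i2cpwm_board
# PWM channels. Channel numbers and PWM ranges below are placeholders;
# the real calibration is in the linked ros_tutorials repository.
import rospy
from geometry_msgs.msg import Twist
from i2cpwm_board.msg import Servo, ServoArray

THROTTLE_CH, STEERING_CH = 1, 2      # assumed channel assignment
PWM_CENTER, PWM_RANGE = 333, 80      # assumed neutral pulse and swing

class LowLevelControlSketch(object):
    def __init__(self):
        self.pub = rospy.Publisher('servos_absolute', ServoArray, queue_size=1)
        rospy.Subscriber('/cmd_vel', Twist, self.on_cmd_vel)

    def on_cmd_vel(self, msg):
        # Scale linear.x to throttle and angular.z to steering around neutral.
        throttle = PWM_CENTER + PWM_RANGE * max(-1.0, min(1.0, msg.linear.x))
        steering = PWM_CENTER + PWM_RANGE * max(-1.0, min(1.0, msg.angular.z))
        arr = ServoArray()
        arr.servos.append(Servo(servo=THROTTLE_CH, value=throttle))
        arr.servos.append(Servo(servo=STEERING_CH, value=steering))
        self.pub.publish(arr)

if __name__ == '__main__':
    rospy.init_node('low_level_control_sketch')
    LowLevelControlSketch()
    rospy.spin()
```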

reginazhai commented 3 years ago

First, the camera is now successfully streaming video to rviz on the host PC. It seems that if we use the "Add" button in rviz to add the "camera" topic and wait long enough, we eventually get the video stream (although not in real time; there is a large delay). Second, we successfully connected the car with rtabmap and move_base.
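
To put a number on that delay, a small node like the one below can report how old each frame is when it arrives. The image topic name is an assumption (check rostopic list for the real one), and the measurement is only meaningful if the clocks on the TX2 and the PC are synchronized.

```python
#!/usr/bin/env python
# Print the age of each incoming camera frame to quantify the rviz delay.
# Topic name is an assumption; clocks on both machines must be in sync.
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    age = (rospy.Time.now() - msg.header.stamp).to_sec()
    rospy.loginfo('frame latency: %.3f s', age)

if __name__ == '__main__':
    rospy.init_node('camera_latency_check')
    rospy.Subscriber('/zed/zed_node/rgb/image_rect_color', Image, on_image)
    rospy.spin()
```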

We launch rtabmap first. It is connected to the ZED camera and publishes tf, map, and odometry information to /tf, /zed/map, and /zed/zed_node/odom.

roslaunch zed-rtabmap-example zed-rtabmap.launch

Then, in order to control the car, we need to plug in the ESC on the car and run the PWM board node:

rosrun i2cpwm_board i2cpwm_board

When we see "Setting PWM frequency to 50 Hz", we know the PWM board is ready. Note: because the car's behavior can be inconsistent, it is recommended to test the car's throttle and steering before actually sending commands.
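
One quick way to do that test is a one-shot script that nudges the throttle and steering channels and then returns them to neutral. The channel numbers and PWM values here are assumptions and must match the calibration used by low_level_control.

```python
#!/usr/bin/env python
# One-shot throttle/steering sanity check via i2cpwm_board. Channel numbers
# and PWM values are assumptions; use the low_level_control calibration.
import rospy
from i2cpwm_board.msg import Servo, ServoArray

def publish(pub, throttle, steering):
    arr = ServoArray()
    arr.servos.append(Servo(servo=1, value=throttle))   # assumed throttle channel
    arr.servos.append(Servo(servo=2, value=steering))   # assumed steering channel
    pub.publish(arr)

if __name__ == '__main__':
    rospy.init_node('pwm_sanity_check')
    pub = rospy.Publisher('servos_absolute', ServoArray, queue_size=1)
    rospy.sleep(1.0)                # let the publisher connect
    publish(pub, 350, 333)          # small forward throttle, centered steering
    rospy.sleep(2.0)
    publish(pub, 333, 333)          # back to neutral
```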

Afterwards, we launch low_level_control, which is included in the "keyboard_demo.launch" file and connects the /cmd_vel output to the actual motors of the car.

roslaunch donkey_car keyboard_demo.launch

Finally, we launch the move_base file:

roslaunch champ_control move_base.launch

This file incorporates the global and local parameters that we want to set, and it probably needs some revision.

Then we set a goal using the "2D Nav Goal" button in rviz to send goals to the car, and keyboard_demo shows the speed and steering values that it is supposed to be sending.

Current Problems:

  1. We still cannot see the path that move_base planned in rviz.
  2. The command sent to the car does not correspond very well to the goal that we set.
  3. We know that keyboard_demo should set speed and steering to idle after several seconds of no operation, but that is somehow not showing up here. We need to investigate this further.
  4. We have to launch a lot of files. We should incorporate all of them into a single launch file, or at least fewer files than we have now.

reginazhai commented 3 years ago

The car can now move by itself! Regarding last week's problems:

  1. We solved this by adding the topics in rviz: press "Add", then add "/move_base/NavfnROS/plan" with type "Path" (for the planned path) and "/move_base/current_goal" with type "Goal" (for the current goal display).
  2. It turns out it was a hardware problem. We have to make sure the ESC light is red so that the goal can be properly sent; it normally turns red after i2cpwm_board is connected. If not, we can use throttleControl to tune it and see how it turns out.
  3. Apparently move_base has its own way of showing that it is dead: it sets the throttle to 0 and the steering to 2 (all the way left).
  4. We compressed everything into a single launch file, and we can set goals using rviz.

Current Problems:

  1. It seems that the car wants to look around when it is stuck, but the camera is fixed, so we need some way of connecting the orientation of the camera to the steering of the wheels (probably something to do with the axes).
  2. move_base seems to accept only one goal. We need to find a way to send goals continuously and figure out how to cancel goals that have already been published (see the sketch below).
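
For the second problem, one option (assuming we keep talking to the standard move_base action interface) is to hold a single SimpleActionClient, cancel whatever goal is active, and then send the next one. The waypoint coordinates, frame name, and timeout below are placeholders.

```python
#!/usr/bin/env python
# Sketch: send a sequence of goals to move_base, cancelling any active goal
# before sending the next one. Waypoints, frame, and timeout are placeholders.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def make_goal(x, y):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0
    return goal

if __name__ == '__main__':
    rospy.init_node('waypoint_runner')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    waypoints = [(1.0, 0.0), (2.0, 1.0), (0.0, 0.0)]   # placeholder route
    for x, y in waypoints:
        client.cancel_all_goals()          # drop anything still active
        client.send_goal(make_goal(x, y))
        # Give each goal a timeout so a stuck goal does not block the route.
        if not client.wait_for_result(rospy.Duration(60.0)):
            rospy.logwarn('goal (%.1f, %.1f) timed out, moving on', x, y)
```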

Overall, the car is performing some kind of autonomous driving. The path and map are also visible on another computer when the TX2 is not connected to an HDMI port, and that computer can also set goals after the TX2 is disconnected from HDMI (though only once). So we have pretty much reproduced last year's results.