MIT License

Team FusionX - CarND Capstone Project

Udacity Self-Driving Car Nanodegree, 2018

Team Members

Project Overview

The final capstone project of the Udacity Self-Driving Car Nanodegree program provides the opportunity to run our code on a modified Lincoln MKZ hybrid vehicle (named "Carla") to autonomously drive around a test lot course with a set of mapped route waypoints and a traffic light.

The vehicle has a Dataspeed drive-by-wire (DBW) interface for throttle/brake/steering control, a forward-facing camera for traffic light detection, and LIDAR for localization (processed by Udacity to provide the car's current pose). The code is run on a Linux PC with Robot Operating System (ROS) and a TitanX GPU for TensorFlow processing.

Since the team members work from multiple global locations (US, Canada, India), all development has been done online through Slack and GitHub team collaboration, using the Udacity vehicle simulator to integrate and prove out the algorithms. Once the code can drive the simulator courses, it is then tested on the real vehicle by Udacity engineers in California.

Video of FX-v3.2 Simulator Test:

Video of FX-v3.2 Site Test Result:

System Architecture

Image source: Udacity Project Overview lesson. Note: Obstacle Detection was not part of this project.

The autonomous control system architecture starts by loading the mapped route base waypoints in the Planning area's Waypoint Loader and applying an overall max-speed guard to each waypoint. This initial setup was provided by Udacity to ensure safe operation in the test lot.
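The max-speed guard described above amounts to clamping each loaded waypoint's target speed. A minimal sketch (the tuple-based waypoint format and `apply_speed_guard` name are illustrative assumptions; the real node works with ROS `Lane`/`Waypoint` messages):

```python
def apply_speed_guard(waypoints, max_speed_mps):
    """Clamp each waypoint's target speed to the overall max speed.

    Waypoints are represented here as (x, y, speed_mps) tuples purely
    for illustration.
    """
    return [(x, y, min(v, max_speed_mps)) for (x, y, v) in waypoints]

base = [(0.0, 0.0, 5.0), (1.0, 0.0, 12.0), (2.0, 0.0, 8.0)]
guarded = apply_speed_guard(base, 10.0)  # speeds above 10 m/s are capped
```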

The system then starts receiving the car's sensor data (current pose from LIDAR localization, current speed, DBW enable switch, and camera image).

The Perception area's Traffic Light Detection Node processes the camera image to detect traffic lights to decide if and where the car needs to stop at an upcoming waypoint location.
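A per-frame classifier is noisy, so a common pattern (and a reasonable reading of how such a node behaves — the class below is a hypothetical sketch, not the team's actual code) is to debounce the detected light state over several consecutive camera frames before committing to a stop decision:

```python
class LightDebouncer:
    """Only accept a traffic light state after it has been observed
    for `threshold` consecutive frames, filtering classifier noise."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.last_state = None   # most recently observed raw state
        self.count = 0           # consecutive frames with that state
        self.stable_state = None # last debounced (accepted) state

    def update(self, state):
        if state == self.last_state:
            self.count += 1
        else:
            self.last_state = state
            self.count = 1
        if self.count >= self.threshold:
            self.stable_state = state
        return self.stable_state

RED = 0  # illustrative constant; the project uses styx_msgs/TrafficLight

deb = LightDebouncer(threshold=3)
# The stop-waypoint index would only be published once the debounced
# state is RED; otherwise the node publishes -1 (no stop).
```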

The Planning area's Waypoint Updater Node plans the driving path target speed profile by setting upcoming waypoints with associated target speeds, including smoothly accelerating up to the target max speed and slowing down to stop at detected red lights.
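The core of the slow-to-stop behavior can be sketched with the constant-deceleration relation v = sqrt(2·a·d): each upcoming waypoint's target speed is the lower of the cruise target and the speed from which the car can still brake to zero at the stop line. This is a simplified illustration (function name and parameters are assumptions), not the team's JMT-based planner:

```python
import math

def decel_profile(dists_to_stop, target_speed, max_decel=1.0):
    """Target speed (m/s) at each waypoint so the car reaches 0 m/s
    at the stop line: v = sqrt(2 * a * d), capped at the cruise speed.

    dists_to_stop: distance (m) from each waypoint to the stop line.
    """
    return [min(target_speed, math.sqrt(2.0 * max_decel * max(d, 0.0)))
            for d in dists_to_stop]
```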

The Control area's Waypoint Follower sets target linear velocity (from the planned waypoint target speeds) and target angular velocity (using Autoware's Pure Pursuit library algorithm to steer toward the waypoint path).
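The pure-pursuit geometry underlying this step: for a lookahead point at (x, y) in the vehicle frame (x forward, y left), the arc curvature is kappa = 2y / L², and the commanded yaw rate is omega = v·kappa. A minimal sketch of that math (function name and frame convention are assumptions; Autoware's implementation adds lookahead-distance selection and other refinements):

```python
def pure_pursuit_omega(lookahead_xy, linear_v):
    """Yaw rate command toward a lookahead point in the vehicle frame.

    kappa = 2 * y / (x^2 + y^2); omega = v * kappa.
    """
    x, y = lookahead_xy
    dist_sq = x * x + y * y
    if dist_sq < 1e-6:          # lookahead point on top of the car
        return 0.0
    kappa = 2.0 * y / dist_sq   # curvature of the connecting arc
    return linear_v * kappa
```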

The Control area's DBW Node (Twist Controller) sets the throttle, brake, and steering commands using PID feedback control for throttle and brake, and kinematic bicycle model yaw control for steering. These commands are sent to the Dataspeed DBW system to actuate the car's pedals and steering wheel.
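The two controller pieces can be sketched as follows: a PID loop on speed error for throttle/brake, and the kinematic bicycle model mapping a commanded yaw rate to a steering-wheel angle. This is a hedged illustration under assumed names and default limits, not the tuned project controller:

```python
import math

class PID:
    """Minimal PID with output clamping (which also bounds windup)."""

    def __init__(self, kp, ki, kd, mn=-1.0, mx=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.mn, self.mx = mn, mx
        self.int_val = 0.0
        self.last_error = 0.0

    def step(self, error, dt):
        self.int_val += error * dt
        deriv = (error - self.last_error) / dt
        self.last_error = error
        out = self.kp * error + self.ki * self.int_val + self.kd * deriv
        return min(self.mx, max(self.mn, out))

def bicycle_steering(wheel_base, steer_ratio, linear_v, angular_v):
    """Kinematic bicycle model: steering-wheel angle that produces the
    commanded yaw rate at the current speed."""
    if abs(linear_v) < 0.1:  # avoid division blow-up near standstill
        return 0.0
    return steer_ratio * math.atan(wheel_base * angular_v / linear_v)
```

In the real node, a low-pass filter is applied to smooth the measured speed before it enters the PID loop, and negative PID output is converted into a brake torque command.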

| Area | Task | Primary Member | Secondary Member | Description |
|------|------|----------------|------------------|-------------|
| Perception | Traffic Light Detection | Shripad | Meenu | Train/implement a neural network traffic light classifier, and determine the stopping locations for red lights |
| Planning | Waypoint Loader | - | - | Use Udacity-provided base code |
| Planning | Waypoint Updater | Anthony | Effendi | Design and implement a smooth speed profile planner using Jerk Minimizing Trajectory (JMT) following dynamic red light stopping locations |
| Control | Waypoint Follower | Effendi | Taylor | Implement improvements to Autoware's base Pure Pursuit library to set target linear and angular velocities to follow upcoming waypoints |
| Control | DBW (Twist Controller) | Taylor | Effendi | Implement & tune PID feedback control with low-pass filtering for throttle/brake commands and kinematic yaw control for the steering command |
| Integration | Simulation Testing | Meenu | Anthony | Test & debug the fully integrated control system with the simulator on a highway track and test lot course |
| Integration | Real-world Image Testing | Meenu | Shripad | Test & debug the traffic light classifier with real-world camera images from recorded ROS bag data |
| Integration | Visualization Tools | Effendi | Taylor | Set up data visualization & analysis tools using ROS RQT with the Multiplot plugin and the RViz 3D scene viewer |
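The Jerk Minimizing Trajectory used by the Waypoint Updater fits a quintic polynomial between boundary conditions on position, velocity, and acceleration. A sketch using the standard closed-form solution of the 3x3 boundary system (function names are illustrative; the project code may structure this differently):

```python
def jmt_coeffs(start, end, T):
    """Quintic JMT coefficients [c0..c5] for s(t) = sum(c_i * t**i).

    start, end: (position, velocity, acceleration) at t=0 and t=T.
    """
    s0, v0, a0 = start
    sT, vT, aT = end
    # Residuals after subtracting the motion implied by the start state
    A = sT - (s0 + v0 * T + 0.5 * a0 * T * T)
    B = vT - (v0 + a0 * T)
    C = aT - a0
    T2, T3 = T * T, T ** 3
    c3 = (20 * A - 8 * B * T + C * T2) / (2 * T3)
    c4 = (-30 * A + 14 * B * T - 2 * C * T2) / (2 * T3 * T)
    c5 = (12 * A - 6 * B * T + C * T2) / (2 * T3 * T2)
    return [s0, v0, 0.5 * a0, c3, c4, c5]

def eval_poly(coeffs, t):
    """Evaluate the polynomial at time t."""
    return sum(c * t ** i for i, c in enumerate(coeffs))
```

For rest-to-rest motion over a unit interval this reproduces the classic minimum-jerk polynomial 10t³ − 15t⁴ + 6t⁵.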

Implementation Details

Perception

Planning

Control

Integration


Original setup instructions from Udacity base repo:

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here.

Please use one of the two installation options: native or Docker.

Native Installation

Docker Installation

Install Docker

Build the docker container

docker build . -t capstone

Run the Docker container

docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone

Port Forwarding

To set up port forwarding, please refer to the instructions from term 2

Usage

  1. Clone the project repository

    git clone https://github.com/udacity/CarND-Capstone.git
  2. Install python dependencies

    cd CarND-Capstone
    pip install -r requirements.txt
  3. Make and run styx

    cd ros
    catkin_make
    source devel/setup.sh
    roslaunch launch/styx.launch
  4. Run the simulator

Real world testing

  1. Download the training bag that was recorded on the Udacity self-driving car.
  2. Unzip the file
    unzip traffic_light_bag_file.zip
  3. Play the bag file
    rosbag play -l traffic_light_bag_file/traffic_light_training.bag
  4. Launch your project in site mode
    cd CarND-Capstone/ros
    roslaunch launch/site.launch
  5. Confirm that traffic light detection works on real life images