MorostCode opened this issue 5 years ago
Hi @MorostCode, this gesture recognition code is set up with ROS (as is Autoware), so the images published from the camera while Autoware is running will also be used by this software to classify gestures. The output gesture is classified as one of the following: UNKNOWN, STOP, GO, TURN_LEFT, TURN_RIGHT, PULL_OVER.
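As a rough sketch of how those class labels might be handled on the consumer side, here is a plain-Python decoder. The `Gesture` enum and `decode_gesture` helper are hypothetical names for illustration, not part of the actual gesture recognition code; a real node would receive the label over a ROS topic.

```python
from enum import Enum

class Gesture(Enum):
    """The gesture classes listed above, as output by the classifier."""
    UNKNOWN = 0
    STOP = 1
    GO = 2
    TURN_LEFT = 3
    TURN_RIGHT = 4
    PULL_OVER = 5

def decode_gesture(label: str) -> Gesture:
    """Map a raw label string (e.g. the payload of a ROS String message)
    to a Gesture, falling back to UNKNOWN for anything unrecognized."""
    try:
        return Gesture[label.strip().upper()]
    except KeyError:
        return Gesture.UNKNOWN
```

Falling back to `UNKNOWN` rather than raising keeps the downstream control path from crashing on a noisy or misclassified frame.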
If this output is as expected (you can use other parameters to ensure it is a safe action, such as the speed of the vehicle, the location, e.g. highway/urban/rural, etc.), then you can pass the information to the Autoware Motion Planner, which will send the appropriate commands to the actuators and control the vehicle accordingly.
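The safety check described above could be sketched as a simple gate before anything is forwarded to the planner. The function name, the speed threshold, and the road-type strings below are illustrative assumptions, not actual Autoware parameters:

```python
# Hypothetical safety gate: forward a recognized gesture to the motion
# planner only when the vehicle state makes the action reasonable.
SAFE_GESTURES = {"STOP", "GO", "TURN_LEFT", "TURN_RIGHT", "PULL_OVER"}

def should_forward_to_planner(gesture: str, speed_kmh: float, road_type: str) -> bool:
    """Return True if the classified gesture should be passed to the planner."""
    if gesture not in SAFE_GESTURES:
        return False                # UNKNOWN or garbage label: ignore it
    if road_type == "highway":
        return False                # pedestrian gestures on a highway are
                                    # almost certainly spurious
    if gesture == "PULL_OVER" and speed_kmh > 60.0:
        return False                # assumed threshold: too fast to pull over
    return True
```

Only when this gate passes would the gesture be translated into a planner command.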
There are different types of planners implemented in Autoware, including the A* Path Planner, the Dynamic Path Planner (DP), the Waypoint Follower, etc. The code for these can be viewed at https://github.com/CPFL/Autoware/tree/master/ros/src/computing/planning/motion/packages. Please let us know if you need any more information and I will try to provide it.
Rohan
Could this code run in an IDE such as PyCharm? If not, can I find some code that will run in it?
I downloaded the dataset and project files, but as a newcomer to Autoware, I'm sorry to say I don't know how to make it work. Can you give me some advice?