Terminology in Carla:
Actor:
Any object that plays a role in the Carla simulation
Blueprint:
Every actor is defined by a blueprint
The pre-defined attributes of each actor type can be tuned through its blueprint
World:
3D representation of an OpenDRIVE map
Carla is split into a server and a client, which communicate over TCP (a minimal client sketch follows after the lists below).
Server roles:
• Run simulation
• Send out the data/information of the simulation
Client roles:
• Request and read the data/information of the simulation
• Control the world (restart, record, replay,…)
• Send execution commands (e.g. movement of vehicles in the next time step)
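A minimal sketch of how the terminology maps onto the client side, assuming a CARLA 0.9.x server already running on localhost:2000 (the blueprint attribute and the spawn point are just examples):

```python
import random
import carla

# Connect the client to the running Carla server (TCP).
client = carla.Client('localhost', 2000)
client.set_timeout(10.0)
world = client.get_world()

# Pick a vehicle blueprint and tune one of its pre-defined attributes.
blueprint_library = world.get_blueprint_library()
vehicle_bp = random.choice(blueprint_library.filter('vehicle.*'))
if vehicle_bp.has_attribute('color'):
    vehicle_bp.set_attribute('color', '255,0,0')

# Spawn the actor into the world at one of the recommended spawn points.
spawn_point = random.choice(world.get_map().get_spawn_points())
vehicle = world.spawn_actor(vehicle_bp, spawn_point)
```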
Sensors:
• RGB & depth cameras
• Semantic segmentation camera
• Lidar
• Collision
• Lane invasion
• GNSS (location as defined in the OpenDRIVE map)
Properties:
○ Sensors are used by attaching them to vehicles (see the sketch after this list).
○ Data from sensors that run on the GPU (cameras & lidar) are slow to read and send out.
○ Sensor data is reproducible only if synchronous mode is enabled.
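A sketch of attaching a camera to a vehicle and receiving its data, assuming the `vehicle` actor and connected `world` from the sketch above; the blueprint name `sensor.camera.rgb` is standard in CARLA 0.9.x, while the attribute values and mounting transform are only examples:

```python
import carla

# Configure an RGB camera blueprint and mount it on the vehicle.
camera_bp = world.get_blueprint_library().find('sensor.camera.rgb')
camera_bp.set_attribute('image_size_x', '800')
camera_bp.set_attribute('image_size_y', '600')
camera_transform = carla.Transform(carla.Location(x=1.5, z=2.4))
camera = world.spawn_actor(camera_bp, camera_transform, attach_to=vehicle)

# The server streams sensor data asynchronously; register a callback.
def on_image(image):
    # image.frame identifies the simulation step the measurement belongs to.
    image.save_to_disk('_out/%06d.png' % image.frame)

camera.listen(on_image)
```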
Reproducible:
Fixed time-step mode:
Simulation time advances by a fixed delta at every step and progresses regardless of whether commands from a client have been received.
The smaller the time-step, the more accurate the physics simulation, but the slower the simulation runs (simulation time falls behind real time), and vice versa.
Synchronous mode:
At each step the simulation halts until a "tick" message is received from a client (see the sketch below).
The ROS bridge only runs in this mode.
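A sketch of enabling both modes for reproducibility, assuming the connected `world` from above; `fixed_delta_seconds` is the 0.9.x setting name and 0.05 s is just an example value:

```python
# Enable synchronous mode with a fixed time-step.
settings = world.get_settings()
settings.synchronous_mode = True      # server waits for a tick from the client
settings.fixed_delta_seconds = 0.05   # fixed simulation-time increment per step
world.apply_settings(settings)

# The client now drives the simulation clock explicitly.
for _ in range(100):
    world.tick()  # advance the simulation by exactly one fixed time-step
```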
Deterministic:
Vehicle dynamics:
• Carla uses the NVIDIA PhysX vehicle model.
• PhysX is expected to produce deterministic results, though this depends on a few factors ([in Carla it is quite repeatable](https://github.com/carla-simulator/carla/issues/798)).
• [Physical attributes of vehicles](https://carla.readthedocs.io/en/latest/python_api/#carlavehiclephysicscontrol-class) can be set (see the sketch below)
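A sketch of tuning the physics parameters through the linked `VehiclePhysicsControl` class, assuming the `vehicle` actor from earlier; the values are illustrative only:

```python
# Read, modify and re-apply the vehicle's physics parameters.
physics = vehicle.get_physics_control()
physics.mass = 1500.0          # kg
physics.drag_coefficient = 0.3
physics.max_rpm = 5500.0
vehicle.apply_physics_control(physics)
```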
Python implementation:
• All OpenDRIVE maps in Carla are generated with a software called "RoadRunner"
• All environments in Carla are generated from an OpenDRIVE file (.xodr) and some binaries (.fbx)
• OpenDRIVE files are stored on disk.
• New maps can be built from those two types of files into a UE4-readable package, which Carla uses as the representation of the world.
Map & Waypoint: classes with a higher-level API that operate on the OpenDRIVE map
Map methods related to Waypoint:
○ Get the waypoint closest to a given location (within a radius)
○ Get the topology of the whole map (a list of pairs of connected waypoints)
Locations on the map in Carla are represented as waypoints, a discretization of the space (see the sketch after this block).
Every waypoint has a hashed ID based on its road_id, section_id, lane_id and coordinates;
two waypoints retrieved within a certain distance of each other (default: 2 cm) share the same ID.
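A sketch of the Map/Waypoint API described above, assuming the connected `world` and the `vehicle` actor from the earlier sketches:

```python
carla_map = world.get_map()

# Closest waypoint to the vehicle's current location (projected onto the road).
waypoint = carla_map.get_waypoint(vehicle.get_location())

# Topology: a list of (start_waypoint, end_waypoint) pairs describing road segments.
topology = carla_map.get_topology()

# Waypoints generated close enough to each other share the same hashed ID.
print(waypoint.id, waypoint.transform.location)
```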
Vehicles:
Control: only raw inputs (throttle, steer, brake, …) are available; commanding a trajectory, a target location, etc. is not supported (see the sketch below).
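A sketch of the raw control input, assuming the `vehicle` actor from earlier; the values are illustrative:

```python
import carla

# Only low-level actuation is exposed: throttle, steer, brake, hand_brake, reverse.
control = carla.VehicleControl(throttle=0.5, steer=0.0, brake=0.0)
vehicle.apply_control(control)
```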
Retrievable data (actors, waypoints; see the sketch after this list):
• Sensor data
• Actor
○ 3D Location (exact)
○ Orientation
○ Velocity
○ Angular velocity
○ Acceleration
• Waypoint
○ Junction
○ Lane width
○ Road id
○ Section id
○ Lane id
○ Left & right lane
○ State of traffic light
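A sketch of reading the listed quantities through the Python API, assuming the `vehicle` and `waypoint` objects from the earlier sketches (attribute names follow the 0.9.x API, e.g. `is_junction`):

```python
# Exact ground-truth state of an actor.
transform = vehicle.get_transform()            # location + rotation (orientation)
velocity = vehicle.get_velocity()              # m/s, world frame
angular_velocity = vehicle.get_angular_velocity()
acceleration = vehicle.get_acceleration()

# Waypoint / lane information at the vehicle's position.
print(waypoint.is_junction, waypoint.lane_width,
      waypoint.road_id, waypoint.section_id, waypoint.lane_id)
left_lane = waypoint.get_left_lane()
right_lane = waypoint.get_right_lane()

# Traffic light currently affecting the vehicle (None if there is none).
traffic_light = vehicle.get_traffic_light()
if traffic_light is not None:
    print(traffic_light.get_state())
```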
C++ implementation: Carla system architecture
The Python API is built on top of the C++ implementation
Pros:
The underlying implementation of the Python classes can be modified if needed
A pre-compiled version (2.x GB) is available
Cons:
All libraries, including Unreal Engine, need to be compiled as well (~15 GB)
Gives access to:
• Sensors' data
• Marker/bounding box of vehicles & pedestrians
• Transform (orientation?)
• Control of actors (vehicles)
• States of traffic lights
No Map & Waypoint implementation, unlike the Python API
Must run in synchronous mode
In our last meeting, we chose the Python interface to work with for now.
@tin1254 Could you elaborate on the prediction that's available? I know we discussed it in the last meeting, but there must be something implemented for the autopilot-controlled vehicles.
@klemense1 I looked into the source code and didn't find any prediction function implemented, if you mean predicting other agents' behavior in the upcoming time steps.
The autopilot is hardcoded and doesn't use any AI technique. The route is random, and the decisions it makes (accelerate or brake) at each time step depend on only two pieces of information:
Source code (function `TickAutopilotController`): https://github.com/carla-simulator/carla/blob/master/Unreal/CarlaUE4/Plugins/Carla/Source/Carla/Vehicle/WheeledVehicleAIController.cpp
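For reference, this hardcoded autopilot is toggled per vehicle from the Python API; a minimal sketch, assuming the `vehicle` actor from earlier:

```python
# Hand control of the vehicle over to the built-in (hardcoded) autopilot.
vehicle.set_autopilot(True)

# ...and take control back to apply manual VehicleControl commands again.
vehicle.set_autopilot(False)
```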
Carla interface (Python) vs. ROS bridge vs. some C++ interface? Pros and cons?
What do they implement? Map/geometry, localization, object lists (static and dynamic)?
What else about other agents? Any predictions/intentions?
How do we extract the data from those three implementations?
Are there any lessons learned from the Carla challenge?
Is it deterministic/reproducible? If we execute an action, do we always get the same state in Carla? (Have a look into the issues in Carla and the challenge.)