huawei-noah / SMARTS

Scalable Multi-Agent RL Training School for Autonomous Driving
MIT License
952 stars · 190 forks

Hi, thanks! I've gone through the offline example, I am wondering if there is an available way in acquiring the `waypoint_paths` for neighborhood vehicles. #1634

Closed georgeliu233 closed 2 years ago

georgeliu233 commented 2 years ago

Hi, thanks! I've gone through the offline example, I am wondering if there is an available way in acquiring the Waypoint paths for neighborhood vehicles.

Originally posted by @georgeliu233 in https://github.com/huawei-noah/SMARTS/issues/1618#issuecomment-1250541950

georgeliu233 commented 2 years ago

Hi, as a follow-up: for each /scenario_ids folder, would it be possible to also include one .pkl per surrounding vehicle in the scenario?

I appreciate your kind help!

saulfield commented 2 years ago

Hi @georgeliu233,

It sounds like you want to record observations for all vehicles in the scenario. You can do this by calling the traffic_histories_to_observations.py script with the -v option, followed by all the vehicle IDs you wish to record. There will be a .pkl file for each one, and they will each have waypoint_paths. I hope this answers your question.
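The per-vehicle .pkl files described above can be inspected with plain `pickle`. The sketch below mocks the recorded data with `SimpleNamespace` stand-ins (the file name, the time-keyed dict layout, and the tuple waypoints are assumptions for illustration; real files hold SMARTS `Observation` objects, which do carry a `waypoint_paths` attribute):

```python
# Sketch: reading a per-vehicle observation .pkl such as those produced by
# traffic_histories_to_observations.py -v <vehicle_id>.
# The data layout here is mocked; real files contain SMARTS Observation objects.
import os
import pickle
import tempfile
from types import SimpleNamespace

# Mock one recorded observation per timestep (stand-in for real SMARTS data).
obs_by_time = {
    0.1: SimpleNamespace(waypoint_paths=[[(0.0, 0.0), (1.0, 0.0)]]),
    0.2: SimpleNamespace(waypoint_paths=[[(1.0, 0.0), (2.0, 0.0)]]),
}

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "vehicle_123.pkl")  # hypothetical file name
    with open(path, "wb") as f:
        pickle.dump(obs_by_time, f)

    # Load it back and walk the waypoint paths per timestep.
    with open(path, "rb") as f:
        loaded = pickle.load(f)

for t, obs in sorted(loaded.items()):
    print(t, obs.waypoint_paths[0][0])
```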

georgeliu233 commented 2 years ago

> Hi @georgeliu233,
>
> It sounds like you want to record observations for all vehicles in the scenario. You can do this by calling the traffic_histories_to_observations.py script with the -v option, followed by all the vehicle IDs you wish to record. There will be a .pkl file for each one, and they will each have waypoint_paths. I hope this answers your question.

Thanks! I am wondering whether the hidden training dataset will contain observations for all vehicles?

Adaickalavan commented 2 years ago

Hi @georgeliu233,

  1. The hidden training dataset only contains .pkl files, consisting of SMARTS observation at each time point, for the ego agents. For example, in a multi-agent scenario where the user needs to control two ego agents, the <scenario_id> folder will contain two .pkl files, one for each ego agent.
  2. After your offline model is trained, it will be evaluated in a manner identical to that of Track1. During evaluation, the environment returns SMARTS observations at each time point for the ego agents only. Hence, you will not have information on the future trajectories of neighbourhood vehicles. You may deduce the past trajectories of neighbourhood vehicles, albeit probably incompletely, from the ego's `observation.neighborhood_vehicle_states[<neighbor>].position`.
  3. The SMARTS observation of each ego agent at each time point does not describe the future trajectories of neighbourhood vehicles as it is not reasonable to exactly know this information in real life.
  4. For your own development and training purposes, you may generate .pkl files, consisting of SMARTS observation at each time point, for all vehicles following the instructions given above.
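Point 2 above can be sketched in a few lines: accumulate each neighbor's `position` across the ego's per-timestep observations. The observation structure is mocked here with `SimpleNamespace` (in SMARTS, `neighborhood_vehicle_states` entries expose `id` and `position` fields, but the episode data below is invented for illustration):

```python
# Sketch: recovering past neighbor trajectories from ego observations alone.
from collections import defaultdict
from types import SimpleNamespace

def mock_obs(neighbors):
    """Stand-in for an ego SMARTS observation with neighborhood_vehicle_states."""
    return SimpleNamespace(
        neighborhood_vehicle_states=[
            SimpleNamespace(id=i, position=p) for i, p in neighbors
        ]
    )

# Ego observations over three timesteps; "car-2" drops out of sensor range,
# which is why recovered trajectories are "probably incomplete".
episode = [
    mock_obs([("car-1", (0.0, 0.0)), ("car-2", (5.0, 1.0))]),
    mock_obs([("car-1", (1.0, 0.0)), ("car-2", (6.0, 1.0))]),
    mock_obs([("car-1", (2.0, 0.0))]),
]

# Accumulate each neighbor's observed positions into a per-vehicle track.
tracks = defaultdict(list)
for obs in episode:
    for nv in obs.neighborhood_vehicle_states:
        tracks[nv.id].append(nv.position)

print(tracks["car-1"])      # full track: three positions
print(len(tracks["car-2"]))  # incomplete track: only two positions
```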
georgeliu233 commented 2 years ago

> Hi @georgeliu233,
>
>   1. The hidden training dataset only contains .pkl files, consisting of SMARTS observation at each time point, for the ego agents. For example, in a multi-agent scenario where the user needs to control two ego agents, the <scenario_id> folder will contain two .pkl files, one for each ego agent.
>   2. After your offline model is trained, it will be evaluated in a manner identical to that of Track1. During evaluation, the environment only returns SMARTS observation at each time point for the ego agents only. Hence, you will not have information on future trajectories of neighbourhood vehicles. You may deduce the past trajectories of neighbourhood vehicles, albeit probably incompletely, by looking at the ego's observation.neighborhood_vehicle_states[<neighbor>].position.
>   3. The SMARTS observation of each ego agent at each time point does not describe the future trajectories of neighbourhood vehicles as it is not reasonable to exactly know this information in real life.
>   4. For your own development and training purposes, you may generate .pkl files, consisting of SMARTS observation at each time point, for all vehicles following the instructions given above.

Hi, thanks for your detailed reply! However, what we are concerned about is actually not the future trajectory but the map information (`waypoint_paths` or `road_waypoint`) for each of the ego's neighborhood vehicles, as it is reasonable to assume the overall scene map information can be acquired in real life. In the current hidden dataset, almost all map information for neighbor vehicles is missing. Thanks again for your kind help and reply!

MCZhi commented 2 years ago

The reason we need `waypoint_paths` for neighborhood vehicles is that we want to build a vectorized map representation of the driving scene. It would also be nice if the organizer could provide the map information (map files) in the offline dataset. We genuinely hope the organizer could consider it.
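For context, a "vectorized map representation" typically means turning each waypoint polyline into a sequence of directed segments (as in VectorNet-style encoders). A minimal sketch, assuming waypoint paths are given as `(x, y)` tuples rather than SMARTS `Waypoint` objects:

```python
# Sketch: vectorizing a waypoint path into directed segments.
# Input format (list of (x, y) tuples) is an assumption, not the SMARTS API.
def vectorize(path):
    """Turn a polyline [(x, y), ...] into segments [(x0, y0, x1, y1), ...]."""
    return [(x0, y0, x1, y1) for (x0, y0), (x1, y1) in zip(path, path[1:])]

waypoint_path = [(0.0, 0.0), (1.0, 0.0), (2.0, 1.0)]
print(vectorize(waypoint_path))
# [(0.0, 0.0, 1.0, 0.0), (1.0, 0.0, 2.0, 1.0)]
```

Each lane or waypoint path becomes one group of such vectors, which a polyline encoder can then consume.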

Adaickalavan commented 2 years ago

Unfortunately, I am afraid we are unable to accommodate your request for a vectorized map representation as far as the NeurIPS2022 competition is concerned.