argoverse / av2-api

Argoverse 2: Next generation datasets for self-driving perception and forecasting.
https://argoverse.github.io/user-guide/
MIT License

motion forecasting corresponding lidar data #4

Closed BaiLiping closed 2 years ago

BaiLiping commented 2 years ago

Hi, if I want to do motion forecasting from lidar data, can I find the corresponding raw lidar data for each frame?

James-Hays commented 2 years ago

Hi!

Unfortunately, no. The lidar data for 250,000 motion forecasting scenarios spanning 763 hours would be about 25 TB with moderate compression.

However, the lidar dataset is 166 hours total, roughly 1/5th the time span of the motion forecasting dataset. It doesn't have any notion of ground truth trajectories, but you could run your own tracker to create trajectories.
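Since the lidar dataset ships without ground-truth trajectories, one simple way to build your own tracks from per-sweep detections is greedy nearest-neighbor association. This is only a minimal sketch, not part of the av2-api: `frames` (a list of per-sweep detection-center arrays) and `max_dist` are hypothetical inputs, and a real tracker would use proper assignment (e.g. Hungarian matching) and motion models.

```python
import numpy as np

def greedy_track(frames, max_dist=2.0):
    """Link per-frame detection centroids into tracks by greedy
    nearest-neighbor association. `frames` is a list of (N_i, 2)
    arrays of object centers, one per lidar sweep; this interface
    is illustrative, not an av2-api function."""
    tracks = []   # each track: list of (frame_idx, center) pairs
    active = []   # indices into `tracks` still being extended
    for t, dets in enumerate(frames):
        unmatched = list(range(len(dets)))
        next_active = []
        for ti in active:
            if not unmatched:
                continue
            last = tracks[ti][-1][1]
            # closest unmatched detection to the track's last center
            j = min(unmatched, key=lambda k: np.linalg.norm(dets[k] - last))
            if np.linalg.norm(dets[j] - last) <= max_dist:
                tracks[ti].append((t, dets[j]))
                unmatched.remove(j)
                next_active.append(ti)  # track continues
        for j in unmatched:
            # any leftover detection starts a new track
            tracks.append([(t, dets[j])])
            next_active.append(len(tracks) - 1)
        active = next_active
    return tracks
```

Tracks that fail to match within `max_dist` are simply terminated; a production tracker would also handle occlusion gaps and identity switches.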

BaiLiping commented 2 years ago

Thanks for your reply. I think I will use those 166 hours of data then. But how do I evaluate forecasting on those 166 hours based on the raw lidar data?

James-Hays commented 2 years ago

If you're trying to forecast objects, you'd need a tracker to give you some notion of ground truth tracks to evaluate against. We don't have "official" tracks to provide. Although if the community is interested in such tracks that would be good to know.
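Once you have tracker-derived pseudo-ground-truth tracks, the standard forecasting metrics ADE (average displacement error) and FDE (final displacement error) are straightforward to compute. A minimal sketch, assuming both the prediction and the pseudo-ground-truth track are `(T, 2)` arrays of x/y positions over the forecast horizon:

```python
import numpy as np

def ade_fde(pred, gt):
    """ADE/FDE between a predicted trajectory and a (pseudo-)ground-truth
    track, both (T, 2) arrays over the same timestamps. Note the `gt`
    here comes from your own tracker, not an official annotation."""
    d = np.linalg.norm(pred - gt, axis=1)  # per-timestep Euclidean error
    return d.mean(), d[-1]                 # (ADE, FDE)
```

Because the "ground truth" is itself a tracker output, its errors propagate into the metric, so results are not directly comparable to benchmarks evaluated on hand-annotated tracks.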

Alternatively, you could try to forecast the point cloud itself (e.g. SPF2 https://www.xinshuoweng.com/papers/SPF2/proceeding.pdf ).

James-Hays commented 2 years ago

I should add that the Argoverse 2 Sensor Dataset has all sensors as well as ground-truth tracks (for the training and validation sets), but it is only 4.2 hours of data. It is similar in scale to the nuScenes and Waymo Open sensor datasets.

BaiLiping commented 2 years ago

Thanks a lot for the pointers. I will look into those.