umautobots / bidirection-trajectory-predicter

The code for Bi-directional Trajectory Prediction (BiTraP).

Question - Video Prediction #6

Closed M-Colley closed 3 years ago

M-Colley commented 3 years ago

Hey, thanks for sharing your work!

Would it be possible to run the prediction and visualize it in videos? Are there any plans to provide such functionality, or could you give some hints on how to do this?

Kind regards!

MoonBlvd commented 3 years ago

Hi @M-Colley, what dataset and what type of visualization are you talking about? For FPV datasets such as JAAD and PIE, you can just plot the bounding boxes using OpenCV's rectangle function or any other visualization tool you want to use. For the ETH-UCY dataset, you may need camera parameters to map the trajectory waypoints (in meters) to image pixels.
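
For the FPV case, a minimal sketch of overlaying boxes on video frames with OpenCV is below. This is not part of the BiTraP repo; the file paths, frame rate, and the hard-coded observed/predicted boxes are placeholders you would replace with your own model outputs.

```python
import cv2

def draw_prediction(frame, observed_box, predicted_boxes):
    """Draw the current box in green and the predicted future boxes in red."""
    x1, y1, x2, y2 = map(int, observed_box)
    cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    for bx1, by1, bx2, by2 in predicted_boxes:
        cv2.rectangle(frame, (int(bx1), int(by1)), (int(bx2), int(by2)), (0, 0, 255), 1)
    return frame

if __name__ == "__main__":
    # Placeholder prediction: in practice, look up the saved model output for
    # this frame/track; a fixed box and a fake future keep the demo runnable.
    obs_box = (100, 150, 160, 300)
    pred_boxes = [(105 + 5 * t, 150, 165 + 5 * t, 300) for t in range(1, 6)]

    cap = cv2.VideoCapture("input_video.mp4")  # path is a placeholder
    writer = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = draw_prediction(frame, obs_box, pred_boxes)
        if writer is None:
            h, w = frame.shape[:2]
            writer = cv2.VideoWriter("output_video.mp4",
                                     cv2.VideoWriter_fourcc(*"mp4v"), 30, (w, h))
        writer.write(frame)
    cap.release()
    if writer is not None:
        writer.release()
```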

M-Colley commented 3 years ago

Thank you for your quick response!

Oh I might have misunderstood something.

I thought I could use any video (in FPV), but after rereading I believe I would first need to detect the pedestrian locations (and also their past trajectories)?

So I don't want to use any labeled training data, but "in the wild" data. Is this possible?

MoonBlvd commented 3 years ago

@M-Colley Unfortunately, our method (and all other trajectory prediction methods) takes past bounding box trajectories as input for FPV data, so you would need to run object detection and tracking first on your "in the wild" data.
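
As an illustration of that preprocessing step, here is a minimal sketch (not from the BiTraP codebase) of turning tracker output into fixed-length past bounding-box windows. The tracker tuple format, the observation length, and the variable names are assumptions for the example, not BiTraP's actual API or config.

```python
from collections import defaultdict
import numpy as np

OBS_LEN = 15  # past frames per observation window (assumed value)

def build_observation_windows(tracked_boxes, obs_len=OBS_LEN):
    """tracked_boxes: iterable of (frame_idx, track_id, x1, y1, x2, y2) from any MOT tracker."""
    tracks = defaultdict(list)
    for frame_idx, track_id, x1, y1, x2, y2 in sorted(tracked_boxes):
        tracks[track_id].append([x1, y1, x2, y2])
    # Keep only tracks long enough, and only their most recent obs_len boxes.
    return {tid: np.asarray(boxes[-obs_len:], dtype=np.float32)
            for tid, boxes in tracks.items() if len(boxes) >= obs_len}

# Tiny synthetic example: one pedestrian track moving right across 20 frames.
synthetic = [(f, 1, 100 + 2 * f, 150, 140 + 2 * f, 300) for f in range(20)]
windows = build_observation_windows(synthetic)
print(windows[1].shape)  # (15, 4): the past boxes to normalize and feed to the predictor
```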

M-Colley commented 3 years ago

Okay thank you, you can close the issue :) and great work!