HARPLab / DReyeVR

VR driving 🚙 + eye tracking 👀 simulator based on CARLA for driving interaction research
https://arxiv.org/abs/2201.01931
MIT License

Display sim-time in replay; synchronization discussion #110

Closed · RStorm98 closed this issue 1 year ago

RStorm98 commented 1 year ago

Hi again,

We now have a number of recordings and some data that we wish to key together. Attached is an example of one of our logs, documenting brake input, throttle input, and time: the brake input is marked in red, the throttle in blue, and game time in pink. Our issue is synchronizing these values with the replay. We were wondering if it is possible to display the game time in the replay the same way the framerate is shown when using start_replaying.py from the API examples.

Here is a picture of an example CSV data file, where each row is a game tick at a fixed frame rate of 20.

This is a screenshot of a game recorded using run_experiment.py, in replay mode.
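A minimal sketch of keying into such a per-tick log (the column names here are assumptions; the real CSV layout may differ):

```python
import csv
import io

# Hypothetical CSV layout: one row per game tick at a fixed 20 FPS,
# with game-time, brake, and throttle columns (actual names may differ).
sample = io.StringIO(
    "game_time,brake,throttle\n"
    "0.00,0.0,0.10\n"
    "0.05,0.0,0.25\n"
    "0.10,0.3,0.00\n"
)

rows = list(csv.DictReader(sample))

# With a fixed 20 FPS tick, row index i should correspond to game time i / 20,
# which is what makes row-to-frame matching possible.
for i, row in enumerate(rows):
    assert abs(float(row["game_time"]) - i / 20) < 1e-9
```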

GustavoSilvera commented 1 year ago

The synchronization you mention is the whole idea behind our synchronized replay. By default, CARLA performs some interpolation of the replay values to match the replay FPS, but if you want an exact 1:1 correspondence between the recording (what you have) and the simulator during replay, such as for screenshots, we recommend you look into this.

This is what we use to get exactly one screenshot per recording timestep (without interpolation) so we are guaranteed that every frame matches up with the recording events.

Note that this is specific to the recording files, which may differ slightly from PythonAPI sensor logging because of PythonAPI synchronization delays. We therefore recommend you get the data you care about by parsing your recording file (in txt form) obtained from show_recorder_file_info.py -a -f ...
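A sketch of pulling (frame, sim-time) pairs out of that text dump. The "Frame N at T seconds" line format is an assumption; adjust the regex to match your CARLA version's actual output:

```python
import re

# Assumed line format in the recorder dump: "Frame N at T seconds".
FRAME_RE = re.compile(r"Frame (\d+) at ([\d.]+) seconds")

def frame_times(recorder_text):
    """Return a dict mapping frame number -> simulation time."""
    return {int(f): float(t) for f, t in FRAME_RE.findall(recorder_text)}

sample = "Frame 1 at 0.05 seconds\nsome event...\nFrame 2 at 0.10 seconds\n"
print(frame_times(sample))  # {1: 0.05, 2: 0.1}
```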

RStorm98 commented 1 year ago

It seems we misunderstood RecordFrames; we've now found the screenshots from our replays and will attempt to link them with our logs.

More specifically, we will put our data into an Excel file and use the rows to match the frames with the values from our DReyeVR_logging.py in the PythonAPI, and hopefully find a correlation between the frames and our data rows.

Regrettably, we didn't read about the parsing function sooner and have already recorded all our data, but the timestamps match between our file and the files produced by show_recorder_file_info, so we can use those to link the data with the frame captures!
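The timestamp matching described here can be sketched as a nearest-neighbor lookup (all names are illustrative; the half-tick tolerance assumes the 20 FPS fixed rate mentioned earlier):

```python
# Match each logged timestamp to the nearest recorder frame, rejecting
# matches farther apart than half a tick (0.025 s at 20 FPS, an assumption).
def match_rows_to_frames(row_times, frame_times, tol=0.025):
    """Map each log timestamp to the closest recorder frame within tol seconds."""
    matches = {}
    for t in row_times:
        frame, ft = min(frame_times.items(), key=lambda kv: abs(kv[1] - t))
        if abs(ft - t) <= tol:
            matches[t] = frame
    return matches

log_times = [0.05, 0.10, 0.16]
rec_times = {1: 0.05, 2: 0.10, 3: 0.15}
print(match_rows_to_frames(log_times, rec_times))  # {0.05: 1, 0.1: 2, 0.16: 3}
```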

Thank you so much for that tip!

For faster data analysis in future projects, we would love to request a display of game time in the replay feature, as that would let us view a specific data value in replay mode from different directions :)

GustavoSilvera commented 1 year ago

No problem! The parser is a somewhat recent addition, and I agree it should be integrated into DReyeVR proper; we'll try to get this into the documentation soon.

What do you mean by game time in the replay? Do you mean a world element that is also replayed (like a world clock on the vehicle dash), or a HUD metadata string? The HUD metadata string should be doable simply by post-processing the captured images.
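The post-hoc HUD string could be sketched like this with Pillow (the frame-to-time mapping and positions are illustrative, not part of DReyeVR):

```python
from PIL import Image, ImageDraw

# Post-hoc overlay sketch: stamp the recording's sim-time onto a replay
# screenshot. Text position and format string are illustrative choices.
def stamp_time(img, sim_time):
    """Draw the simulation time in the image's top-left corner."""
    draw = ImageDraw.Draw(img)
    draw.text((10, 10), f"t = {sim_time:.3f} s", fill=(255, 255, 255))
    return img

# Stand-in for a loaded screenshot; in practice, Image.open(path) per frame.
frame = Image.new("RGB", (320, 240), (0, 0, 0))
stamped = stamp_time(frame, 1.625)
```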

Also, when you mention "different directions", do you mean different camera directions? That is already supported in our replay system with various configurations (driver's POV, top-down bird's-eye view, vehicle front, third person)!

RStorm98 commented 1 year ago

Yes, a world clock is a good way to put it! We used the "timestamp_stream" data from DReyeVR_logging.py to synchronize the API data with the log file from show_recorder_file_info.py, so that is the value we mean by "time in the replay". A time display in the simulator itself would make it easier to locate points of interest in the trial runs by looking up the times from the recording files. And yes, "different directions" is our crooked way of saying different camera directions (sorry for the bad English, haha). We meant that a display of a world clock ("timestamp_stream") could be a way of finding a specific frame, e.g. where the throttle is 0.2367236, by matching the time value and then viewing that frame in replay mode from different camera directions.
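The lookup described here can be sketched directly from the per-tick data, without needing an in-sim clock (names are illustrative):

```python
# Find the zero-indexed tick whose throttle matches a target value; the
# matching replay screenshot can then be found by its index.
def find_tick(throttle_values, target, tol=1e-6):
    """Return the first tick index where throttle == target (within tol)."""
    for i, v in enumerate(throttle_values):
        if abs(v - target) <= tol:
            return i
    return None

throttle = [0.0, 0.1, 0.2367236, 0.4]
print(find_tick(throttle, 0.2367236))  # 2
```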

GustavoSilvera commented 1 year ago

Sounds good. Currently the synchronization comes from the (screenshot) frames, which are associated with the recording at the per-data-point level. For instance, if we recorded 100 timesteps (with no. 32 having the throttle value 0.2367236, say), then you can find the corresponding screenshot easily: it's the 32nd one, labeled Unreal/CarlaUE4/FrameCap/TIME_DATE_OF_REPLAY/tick_s0_p0_00031.jpg since we zero-index. The naming scheme is "tick_s{shader_idx}_p{camera_pos_idx}_{data_id}.jpg", where shader_idx selects among the shaders we can render (RGB, depth, semantic segmentation), camera_pos_idx selects among the available camera views (first person, third person, bird's-eye, front), and data_id is the zero-indexed index of the data point.
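The index-to-filename mapping described here can be sketched as follows (the five-digit zero padding is an assumption inferred from the tick_s0_p0_00031.jpg example):

```python
# Map a 1-based recording timestep to its zero-indexed screenshot filename,
# following the "tick_s{shader_idx}_p{camera_pos_idx}_{data_id}.jpg" scheme.
def screenshot_name(timestep, shader_idx=0, camera_pos_idx=0):
    """Recording timestep N (1-based) -> zero-indexed screenshot filename."""
    return f"tick_s{shader_idx}_p{camera_pos_idx}_{timestep - 1:05d}.jpg"

print(screenshot_name(32))  # tick_s0_p0_00031.jpg
```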

Currently, we have a way to optionally render all the shaders and camera views by toggling the flags here, which should hopefully be useful for you.

RStorm98 commented 1 year ago

Using the frame captures has definitely been helpful. We have currently acquired all the data we need, but we'll definitely keep these features in mind for the future. Thank you for the help, and sorry for the late reply.