georghess / neurad-studio

[CVPR2024] NeuRAD: Neural Rendering for Autonomous Driving
https://research.zenseact.com/publications/neurad/
Apache License 2.0

Rolling Shutter Parameters #27

Open ntseng450 opened 4 months ago

ntseng450 commented 4 months ago

Hello, I'm curious about how the rolling shutter parameters used in the experiments were obtained.

For PandaSet, the rolling shutter is set to (-0.03, 0.01); however, it isn't clear from the paper how this was estimated. Since the cameras' individual timestamps aren't available in the data, how was the shutter time approximated manually? Also, were none of the other datasets high-speed enough to benefit from rolling shutter modeling?

georghess commented 4 months ago

Hi,

As you say, the parameters are not available in the raw data, hence we estimated them manually. The process was something like:

  1. Find scenes where we drive past something close by that is standing straight up, for example a pole at the side of the road.
  2. Check the point cloud data to get the distance from the side camera to the object in question.
  3. Look at the pose data to estimate our velocity as we drive by.
  4. Combine these into a rough guess for the rolling shutter parameters (see the sketch after this list).
  5. Do some parameter sweeps around that guess to find a good value. We then use the same parameters for all cameras, which of course might not be correct, but works well enough in practice. Further, we've seen that the exposure parameters sometimes seem to differ within a scene (e.g. when the sun shines straight into the camera and is then obstructed), so this process is prone to errors.
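For concreteness, here is a rough sketch of how steps 2-4 can be combined into a first guess for the readout time. The function and all the numbers below are hypothetical, purely to illustrate the back-of-envelope math (apparent shear of a vertical object ≈ focal length × speed × readout time / distance):

```python
# Back-of-envelope estimate of the rolling-shutter readout time, following the
# process described above. All values here are hypothetical placeholders, not
# numbers taken from the paper or the repo.

def estimate_readout_time(skew_px: float, distance_m: float,
                          focal_px: float, ego_speed_mps: float) -> float:
    """Readout duration (s) from the apparent horizontal shear of a vertical object.

    A static vertical pole at `distance_m`, passed at `ego_speed_mps`, appears
    sheared by `skew_px` pixels between the first and last image row because the
    rows are exposed at slightly different times. With a pinhole approximation,
    the image-space shift rate is focal_px * speed / distance, so the readout
    time is the observed shear divided by that rate.
    """
    pixels_per_second = focal_px * ego_speed_mps / distance_m
    return skew_px / pixels_per_second


if __name__ == "__main__":
    # Hypothetical measurements: a pole 8 m from the side camera, ego moving at
    # 10 m/s, focal length ~1920 px, observed top-to-bottom shear of ~96 px.
    t_readout = estimate_readout_time(skew_px=96.0, distance_m=8.0,
                                      focal_px=1920.0, ego_speed_mps=10.0)
    print(f"approximate readout time: {t_readout * 1000:.1f} ms")
```

A guess like this then only fixes the total readout window; the split into offsets before/after the nominal frame timestamp is what the parameter sweep in step 5 refines.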

I think if you are not driving past things at high velocity, the rolling shutter compensation usually does not give big benefits on any metrics. For the lidar, however, it is important to capture these effects to get proper geometry supervision.
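To make the offsets concrete, here is a minimal sketch of how parameters like (-0.03, 0.01) could be mapped to per-row capture times when generating camera rays, assuming a linear ramp from the first to the last row. This is just an illustration of the idea, not the actual neurad-studio implementation:

```python
import numpy as np

def per_row_times(frame_time_s: float, num_rows: int,
                  shutter_offsets_s: tuple[float, float] = (-0.03, 0.01)) -> np.ndarray:
    """Capture time of each image row, relative to the frame's nominal timestamp.

    Assumes the first row is exposed at frame_time + t_first and the last row at
    frame_time + t_last, with a linear ramp in between (hypothetical convention).
    """
    t_first, t_last = shutter_offsets_s
    row_fraction = np.arange(num_rows) / max(num_rows - 1, 1)  # 0 at top row, 1 at bottom
    return frame_time_s + t_first + row_fraction * (t_last - t_first)

# Example: rays sampled from row r of a 1080-row image would be rendered at
# per_row_times(frame_time_s=0.0, num_rows=1080)[r], so rays near the top and
# bottom of the image use ego/sensor poses that differ by the full 40 ms window.
```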