Thank you for creating and releasing this dataset to the public.
I am looking into using it for visual-inertial odometry research and was hoping you could provide the camera distortion parameters, as well as the accelerometer and gyroscope bias and random-walk parameters, if you have them.
Thank you for your interest!
It's true that we forgot to mention the intrinsic parameters of the cameras on the website (we will correct this as soon as we can); the answer can be found in the paper. Since we are using a simulator, the cameras are perfectly calibrated by default (principal point exactly in the middle of the image, square pixels, no skew, no distortion, ...). As the first step of virtually every vision pipeline is to undistort and rectify the images, we didn't see the point in simulating arbitrary imperfections. Additionally, it allows the performance of different algorithms to be compared while keeping potential calibration weaknesses out of the equation.
Concerning the IMU, we provide an initial bias estimate for each trajectory and each sensor in the attributes of the corresponding HDF5 group. However, we don't provide noise parameters for now. As we are open to suggestions for improvement, we may add them in the future if there is a confirmed need for that specific data.
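For example, reading those attributes with h5py looks roughly like this (a minimal sketch; the group and attribute names below are illustrative placeholders, not the dataset's exact keys):

```python
# Sketch: reading per-trajectory IMU bias attributes with h5py.
# Group/attribute names are hypothetical placeholders.
import h5py

with h5py.File("trajectory_0000.hdf5", "r") as f:
    imu = f["imu"]  # hypothetical group name
    accel_bias = imu.attrs["init_bias_est_accel"]  # hypothetical attribute key
    gyro_bias = imu.attrs["init_bias_est_gyro"]    # hypothetical attribute key
    print("accelerometer bias estimate:", accel_bias)
    print("gyroscope bias estimate:", gyro_bias)
```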
Thank you for the prompt response.
For the camera parameters, I was able to get VINS-Fusion and SVO 2.0 to produce reasonable estimates using distortion parameters set to zero, principal point in the center of the image, and focal lengths equal to half the image width (since FOV is 90 deg).
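For reference, this is roughly how I build the intrinsics (a minimal sketch; the image resolution is a placeholder, and I'm assuming a standard pinhole model with the principal point at the exact center):

```python
# Sketch: pinhole intrinsics for an undistorted camera with a 90-deg FOV.
# f = (W / 2) / tan(FOV / 2), which reduces to W / 2 when FOV = 90 deg.
import numpy as np

W, H = 1024, 1024  # placeholder resolution; use the dataset's actual image size
fov_deg = 90.0
f = (W / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # = W / 2 here

K = np.array([[f,   0.0, W / 2.0],
              [0.0, f,   H / 2.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # distortion coefficients set to zero
```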
And for the IMU parameters, conservative estimates seem sufficient with the above methods. However, exact bias and random walk values would be excellent for precise evaluations. When collecting the IMU data with AirSim, did you use the ground truth kinematics and apply simulated noise?
Thanks again!
I'm glad to hear that it's working without trouble on your side! I've noted the idea; we will add the noise parameters to the dataset in a future update.
Our IMU measurements are simulated based on the ground-truth kinematics recorded from AirSim and on the noise model given in the paper. We did this because we couldn't get AirSim to take measurements at precise time intervals: there were significant variations in the log timestamp intervals (sometimes as large as dozens of milliseconds). To get ground-truth measurements recorded at exactly 100 Hz, we had to interpolate and resample the records obtained from AirSim. Since the IMU measurements have to be synchronized with the ground-truth kinematics, we were left with no choice but to re-simulate them with our own model. This, however, gave us the advantage of being able to choose our own noise parameters.
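For illustration, the pipeline is conceptually something like the sketch below (a simplified toy version, not our actual code; the noise densities are placeholders):

```python
# Simplified sketch: resample irregular ground-truth logs to an exact 100 Hz
# grid, then corrupt them with white noise plus a bias random walk.
import numpy as np

def resample_and_corrupt(t_log, signal_log, rate=100.0,
                         sigma_w=1e-3, sigma_b=1e-4, bias0=0.0):
    """t_log: irregular log timestamps [s]; signal_log: one IMU axis.
    sigma_w / sigma_b are placeholder noise densities, not the dataset's values."""
    dt = 1.0 / rate
    t = np.arange(t_log[0], t_log[-1], dt)          # exact 100 Hz timestamps
    ground_truth = np.interp(t, t_log, signal_log)  # linear interpolation

    n = len(t)
    white = (sigma_w / np.sqrt(dt)) * np.random.randn(n)                  # measurement noise
    bias = bias0 + np.cumsum(sigma_b * np.sqrt(dt) * np.random.randn(n))  # bias random walk
    return t, ground_truth + bias + white
```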
Ah, interpolating the ground truth measurements and simulating noise seems like a reasonable workaround.
Is it possible to share the noise parameters here? Did you follow AirSim's target sensor (InvenSense MPU 6000 IMU) with noise parameters listed in their source code?
I see now the toy example shared in issue #3. Just to clarify, the IMU noise parameters were set randomly for each trajectory?
Yes, they were; I'm sorry if that was not clear before. That's why, if we were to provide the noise parameters, we would need to give them for each individual trajectory. The idea behind this choice (random and unknown parameters) was to promote the development of visual-inertial odometry algorithms that can deal with this uncertainty.
Since we are considering extending the dataset with IMU data that includes the noise parameters, we are conducting an informal survey on how to do it. Should we simulate the IMU data with the same noise parameters for every trajectory, or is it more interesting to vary them from one trajectory to the next? If you have an opinion on this, let us know and we will take it into account when making the final decision.
Thank you for confirming.
With regard to future choices for the IMU data, I think it would be useful to provide the ground truth together with, perhaps, a script that can generate IMU data with user-defined noise parameters. Robustness could then be tested across a known spectrum.
It looks like ground-truth IMU data can already be recovered, though, from the included ground-truth angular velocity, acceleration, and attitude (to rotate the acceleration from the world frame to the body frame), so I will take that route for now; a rough sketch of that is below.
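In case it's useful to others, here is the sketch (I'm assuming a z-up world frame with gravity along -z and a body-to-world attitude quaternion; signs and frame conventions should be double-checked against the dataset's documentation):

```python
# Sketch: recovering ideal (noise-free) IMU measurements from ground truth.
# Frame/gravity conventions below are assumptions; verify against the dataset.
import numpy as np
from scipy.spatial.transform import Rotation

G_WORLD = np.array([0.0, 0.0, -9.81])  # assumed gravity vector in the world frame

def ideal_imu(q_wb, accel_world, gyro_body):
    """q_wb: body-to-world quaternion [x, y, z, w]; accel_world: linear
    acceleration in the world frame [m/s^2]; gyro_body: angular velocity
    already expressed in the body frame [rad/s]."""
    R_wb = Rotation.from_quat(q_wb).as_matrix()    # body -> world rotation
    accel_meas = R_wb.T @ (accel_world - G_WORLD)  # specific force, body frame
    gyro_meas = gyro_body                          # gyro reads body-frame rates
    return accel_meas, gyro_meas
```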
Hope that's helpful and thank you again!