Closed DunFenTiao closed 3 years ago
The offset should be calculated between the ground truth and your result. Check this page: https://drive.segwayrobotics.com/benchmark/vi_odometry_slam
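For a rough sense of what that offset computation involves, here is a minimal sketch of nearest-timestamp association between your result and the ground truth, in the spirit of TUM-style evaluation tools. The function name `associate` and the 0.02 s default tolerance are my own assumptions, not part of the benchmark scripts:

```python
def associate(ts_a, ts_b, max_diff=0.02):
    """Greedily match timestamps in ts_a to ts_b within max_diff seconds.

    Returns a sorted list of (a, b) pairs, each timestamp used at most once.
    (Illustrative sketch only; the actual scripts may differ.)
    """
    # All candidate pairs within tolerance, closest differences first.
    candidates = sorted((abs(a - b), a, b)
                        for a in ts_a for b in ts_b
                        if abs(a - b) < max_diff)
    used_a, used_b, matches = set(), set(), []
    for _, a, b in candidates:
        if a not in used_a and b not in used_b:
            used_a.add(a)
            used_b.add(b)
            matches.append((a, b))
    return sorted(matches)
```

Each matched pair then gives you corresponding poses to compare, which is what the error metrics below operate on.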
Thanks! But that page shows a "Whitelabel Error Page" and I can't access it. Is there another way to get it?
Something is wrong with the router of the website; you can find the page this way:
https://drive.segwayrobotics.com/ then -> Benchmark -> VIO/SLAM benchmark
It's better to read that page, but if you still cannot see it, you can follow these steps:
1. The evaluation scripts are here: https://github.com/segwayrobotics/segway_DRIVE_benchmark/tree/master/scripts
2. Install the dependencies:
```
sudo pip install ceres
sudo apt-get install libboost-all-dev
pip install opencv-python
sudo apt-get install python-tf
sudo apt-get install autoconf
sudo apt-get install python-tk
sudo apt-get install libv4l-dev
```
3. Run the scripts, for example:
```
python ./scripts/evaluate_ate.py \
    $DATA_DIR/your_result.csv \
    $DATA_DIR/camera_groundtruth.csv \
    --plot $DATA_DIR/result.svg \
    --save result_ate.txt \
    --verbose

python ./scripts/evaluate_rpe.py \
    $DATA_DIR/your_result.csv \
    $DATA_DIR/camera_groundtruth.csv \
    --plot $DATA_DIR/result.svg \
    --fixed_delta --delta 2 \
    --delta_unit s \
    --save result_rpe.txt \
    --verbose
```
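For context, an ATE evaluation of this kind typically rigidly aligns the estimated trajectory to the ground truth before computing the RMSE of the position residuals. Here is a minimal sketch of that closed-form alignment (Horn/Kabsch via SVD, no scale); `align_and_rmse` is a hypothetical name, not the script's actual API:

```python
import numpy as np

def align_and_rmse(gt, est):
    """Rigidly align est (Nx3) to gt (Nx3); return (R, t, ATE RMSE).

    Illustrative sketch of TUM-style ATE, assuming already-associated
    position pairs; the benchmark scripts may differ in details.
    """
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets.
    H = (est - mu_est).T @ (gt - mu_gt)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction so R is a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_gt - R @ mu_est
    # Residuals of the aligned estimate against ground truth.
    err = gt - (est @ R.T + t)
    return R, t, float(np.sqrt((err ** 2).sum(axis=1).mean()))
```

After alignment, the reported RMSE is invariant to the global pose of your estimate, so only the trajectory shape matters.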
Thanks a lot! That is very detailed; I can evaluate my result now. One more question: could a calibrated extrinsic between the camera and the Lidar be provided?
The ground truth is the value after the extrinsics have been applied, so forget about the Lidar; there is only one camera.
Thanks a lot, I got it~
I want to evaluate my method on your dataset, but I can't find the calibration data (including the time offset and the position offset between the ground truth and the dataset) on the https://drive.segwayrobotics.com/benchmark page. Could you help me?