-
Hi, I need the groundtruth affordance maps for my project and I was wondering how you obtained them. I looked into the code a lot but I could not find it. In train_unet.py, there is a section called G…
-
Thank you for sharing this work. Can you tell me how to get the GT label? Or which tool did you use to get the GT segmentation? Thank you so much for your answer.
-
-
I did a small test with the camera mounted on a rover ([without the front hood on a car](https://github.com/rpng/open_vins/issues/379)), but the results were worse than previous attempts.
launch_d455.exam…
-
Hi Mathias,
Do you provide the groundtruth trajectories of the left event camera in the dataset? I noticed that an RTK was deployed right there.
Also, will the GT trajectory be available in the …
-
Currently our groundtruth creation pipeline maps an input bagfile (part of this process is removing low-baseline images) and then registers it against an existing base SURF map. The groundtruth poses …
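The low-baseline filtering step mentioned above could be sketched roughly like this. This is a minimal illustration, not the actual pipeline: the threshold value, the 4x4 pose-matrix input format, and the function name are all assumptions.

```python
import numpy as np

def filter_low_baseline(poses, min_baseline=0.05):
    """Keep only frames whose translation from the last kept frame
    exceeds min_baseline (metres). `poses` is a list of 4x4 pose matrices."""
    kept = [0]  # always keep the first frame
    for i in range(1, len(poses)):
        # baseline = Euclidean distance between camera centres
        baseline = np.linalg.norm(poses[i][:3, 3] - poses[kept[-1]][:3, 3])
        if baseline >= min_baseline:
            kept.append(i)
    return kept

# toy example: frames moving 1 cm apart along x
poses = []
for k in range(10):
    T = np.eye(4)
    T[0, 3] = 0.01 * k
    poses.append(T)
print(filter_low_baseline(poses, min_baseline=0.05))  # [0, 5]
```

With a 5 cm threshold, only every fifth of these 1 cm-spaced frames survives; the real pipeline presumably tunes this to the map's scale.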
-
# Description
Given the case that groundtruth has been created once upon a time and the original (= referenced) image has been scanned again with different parameters (resolution, channels, size, an…
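For the resolution/size part of this, existing pixel-coordinate annotations can usually be carried over to the rescan by a per-axis scale factor. A minimal sketch, assuming the annotations are (x, y) pixel coordinates and that only uniform rescaling (no cropping or rotation) happened between the two scans; the function name and sizes are made up for illustration:

```python
import numpy as np

def rescale_annotations(points, old_size, new_size):
    """Map annotation coordinates from the original scan to a rescan
    with a different resolution.
    points: (N, 2) array of (x, y) pixels; sizes are (width, height)."""
    sx = new_size[0] / old_size[0]
    sy = new_size[1] / old_size[1]
    return np.asarray(points, dtype=float) * np.array([sx, sy])

# annotation at (100, 200) on a 1000x800 scan, rescan is 500x400
pts = rescale_annotations([[100, 200]], old_size=(1000, 800), new_size=(500, 400))
print(pts)  # [[ 50. 100.]]
```

Channel or bit-depth changes obviously do not affect coordinates, only the pixel values, so those would need a separate remapping step.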
-
I'm getting this error when I try to run main.py, can anybody please help out?
```
groundtruth_rotation = raw_groundtruth[1][0].reshape((3, 3)).T  # opposite rotation of the first frame
ValueError…
```
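A `ValueError` on that line usually means `raw_groundtruth[1][0]` does not contain exactly 9 values, since `reshape((3, 3))` requires the flat array to have size 9. A quick stand-alone check of both cases (the array contents here are dummies, not the actual dataset values):

```python
import numpy as np

raw = np.arange(9.0)          # a flat 9-element rotation record
R = raw.reshape((3, 3)).T     # succeeds: 9 == 3 * 3
print(R.shape)                # (3, 3)

bad = np.arange(7.0)          # wrong length, e.g. a truncated/mis-parsed row
try:
    bad.reshape((3, 3))
except ValueError as e:
    print("ValueError:", e)   # cannot reshape array of size 7 into shape (3,3)
```

So it is worth printing `raw_groundtruth[1][0].shape` first; a mismatched groundtruth file format (extra columns such as timestamps, or a quaternion instead of a rotation matrix) is the most common cause.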
-
I was wondering how to evaluate with the `LAB_Survey_1.bag`, or whether there is an already pre-processed groundtruth trajectory for it. Is `/natnet_ros/rigid_bodies/Phasma/pose` the marker frame of …
-
I want to compare the estimated pose with the ground truth pose using ov_eval. The dataset that I am using is the EuRoC dataset. I am recording a rosbag of the estimated pose data (/ov_msckf/poseimu) …
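As a sanity check before running ov_eval, the core comparison can be sketched in a few lines: given time-associated position arrays from the estimated and groundtruth trajectories, compute the absolute trajectory error (ATE) RMSE. This is a simplified stand-alone version; ov_eval additionally performs SE(3)/posyaw trajectory alignment and timestamp association, which are omitted here:

```python
import numpy as np

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error between two
    time-aligned (N, 3) position arrays, in the same frame."""
    err = np.linalg.norm(est - gt, axis=1)  # per-timestamp position error
    return float(np.sqrt(np.mean(err ** 2)))

# toy trajectories: estimate has a constant 0.1 m lateral offset
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = gt + np.array([0.0, 0.1, 0.0])
print(ate_rmse(est, gt))  # 0.1
```

If the numbers from a sketch like this and from ov_eval disagree wildly, the usual suspects are a missing alignment step or the two trajectories being expressed in different frames (e.g. IMU vs. marker frame on EuRoC).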