lmb-freiburg / deeptam

DeepTAM: Deep Tracking and Mapping https://lmb.informatik.uni-freiburg.de/people/zhouh/deeptam/
GNU General Public License v3.0

Can you please provide the data you trained your models on? #5

Open rishimadhok opened 5 years ago

rishimadhok commented 5 years ago

Hi,

Thank you for releasing your code. I was able to reproduce the results shown in your paper.

For training the tracking part, you mention in your paper that:

We train on image pairs from the SUN3D dataset and the SUNCG dataset. For SUN3D we sample image pairs with a baseline of up to 40cm. For SUNCG we generate images with normally distributed baselines with standard deviation 15cm and rotation angles with standard deviation 0.15 radians. When sampling an image pair we reject samples with an image overlap of less than 50%. For keyframe depth maps DK, we use the ground truth depth from the datasets during training.
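To make sure I understand the sampling rules in that paragraph correctly, here is a rough sketch of how I imagine the pair selection works. The function names, the simple pinhole reprojection used for the overlap check, and the per-axis reading of the standard deviations are my own assumptions based only on the quote above, not code from this repository:

```python
import numpy as np


def relative_motion(T_k, T_c):
    """Baseline (m) and rotation angle (rad) between two camera-to-world poses."""
    T_rel = np.linalg.inv(T_k) @ T_c
    baseline = np.linalg.norm(T_rel[:3, 3])
    cos_angle = np.clip((np.trace(T_rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return baseline, np.arccos(cos_angle)


def overlap_ratio(depth_k, K, T_k, T_c):
    """Fraction of keyframe pixels with valid depth that reproject inside the
    second image -- a crude stand-in for the paper's 'image overlap'."""
    h, w = depth_k.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth_k > 0
    z = depth_k[valid]
    # Back-project keyframe pixels into keyframe camera coordinates.
    pts_k = np.stack([(u[valid] - K[0, 2]) * z / K[0, 0],
                      (v[valid] - K[1, 2]) * z / K[1, 1],
                      z])
    # Keyframe camera -> world -> second camera.
    T_c_k = np.linalg.inv(T_c) @ T_k
    pts_c = T_c_k[:3, :3] @ pts_k + T_c_k[:3, 3:4]
    in_front = pts_c[2] > 1e-6
    u_c = K[0, 0] * pts_c[0, in_front] / pts_c[2, in_front] + K[0, 2]
    v_c = K[1, 1] * pts_c[1, in_front] / pts_c[2, in_front] + K[1, 2]
    inside = (u_c >= 0) & (u_c < w) & (v_c >= 0) & (v_c < h)
    return inside.sum() / max(valid.sum(), 1)


def accept_sun3d_pair(depth_k, K, T_k, T_c, max_baseline=0.40, min_overlap=0.5):
    """Keep a SUN3D pair only if the baseline is <= 40 cm and overlap >= 50%."""
    baseline, _ = relative_motion(T_k, T_c)
    return baseline <= max_baseline and overlap_ratio(depth_k, K, T_k, T_c) >= min_overlap


def sample_suncg_offset(rng=None):
    """Draw a random relative motion for rendering a SUNCG pair: translation with
    15 cm std and rotation angles with 0.15 rad std (applied per axis here,
    which is my interpretation of the quoted sentence)."""
    rng = np.random.default_rng() if rng is None else rng
    translation = rng.normal(scale=0.15, size=3)      # metres
    rotation_angles = rng.normal(scale=0.15, size=3)  # e.g. Euler angles, radians
    return translation, rotation_angles
```

If the actual sampling or overlap computation differs from this, a pointer to the real scripts would clear that up.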

Can you please share the exact data on which you trained your tracking model, or perhaps a script to download the data in the required format for training?

Similarly for the mapping module.

Thank you so much in advance!

ghost commented 3 years ago

Hello @rishimadhok,

Were you able to obtain or create the data for training? Also, where can one find the training code for the tracking part?

Thanks!