Robust Edge-based Visual Odometry (REVO)

There is a problem with running large datasets #8

Open RuiYangQuan opened 2 years ago

RuiYangQuan commented 2 years ago

When I used a dataset of more than 1500 frames to verify the algorithm, I found that it could not fully map all scenes in the dataset. Is there a buffer mechanism that makes the algorithm stop drawing after a certain limit is exceeded? Please give me some suggestions, thank you!

fabianschenk commented 2 years ago

Hi @RuiYangQuan,

Hmm, that should not happen. I often tested with a 30 fps sensor and recorded several minutes (~2k-5k frames) without issues. Does it just stop drawing, or does it lose tracking? Please keep in mind that the code is ~4 years old, and changes to Pangolin or other libraries might cause an issue.

RuiYangQuan commented 2 years ago

Thanks for your reply. You are right, there is no problem with real-time operation from a sensor, but I often run into this problem when running on datasets.

fabianschenk commented 2 years ago

Hi @RuiYangQuan ,

Could you check the dataset config you're using? There's a parameter that limits the number of images to read, e.g. in dataset_tum1.yaml: https://github.com/fabianschenk/REVO/blob/eb949c0fdbcdf0be09a38464eb0592a90803947f/config/dataset_tum1.yaml#L44
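For illustration, here is a minimal sketch of how such a frame limit might appear in a dataset YAML config. The key names below are assumptions chosen for clarity, not REVO's actual keys; the real parameter is at line 44 of the linked dataset_tum1.yaml.

```yaml
# Hypothetical dataset config excerpt (key names are assumptions;
# see line 44 of the linked dataset_tum1.yaml for the actual key).
datasetDir: /path/to/rgbd_dataset_freiburg1_xyz  # hypothetical path key
# Maximum number of frames to read from the dataset.
# If this is lower than the sequence length (e.g. 1500 for a 2000-frame
# sequence), processing stops early and the map looks incomplete.
nMaxFrames: 1500  # hypothetical name; raise it to cover the full sequence
```

If the config you're using carries a similar limit below your sequence length, raising it (or removing the cap, if the code supports that) should let the whole dataset be mapped.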