princeton-vl / DROID-SLAM

BSD 3-Clause "New" or "Revised" License
1.65k stars · 272 forks

Error while running inference on custom dataset #95

Closed jorgepradoh closed 1 year ago

jorgepradoh commented 1 year ago

Hello, I am trying to run DROID-SLAM on a custom dataset; however, after a certain number of iterations (between 2k and 3k), I get the error

/DROID-SLAM/droid_slam/droid_frontend.py", line 69, in __update
    self.video.poses[self.t1] = self.video.poses[self.t1-1]
IndexError: index 512 is out of bounds for dimension 0 with size 512
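For context, a minimal sketch of this failure mode (using NumPy in place of the PyTorch tensors DROID-SLAM actually uses; `BUFFER` and `add_keyframe` are illustrative names, not the project's API): the frontend preallocates fixed-size state arrays, and once the keyframe index reaches the buffer size, the assignment in the traceback overflows.

```python
import numpy as np

# Illustrative reproduction of the error (NumPy stand-in for the PyTorch
# tensors DROID-SLAM uses; BUFFER and add_keyframe are made-up names).
# The state arrays are preallocated with a fixed number of keyframe
# slots; indexing one slot past the end raises IndexError.
BUFFER = 512
poses = np.zeros((BUFFER, 7))  # one 7-DoF pose per keyframe slot

def add_keyframe(t1: int) -> None:
    # initialize the new keyframe's pose from the previous one
    poses[t1] = poses[t1 - 1]

add_keyframe(511)      # last valid slot: fine
try:
    add_keyframe(512)  # one past the end: IndexError, as in the traceback
except IndexError as err:
    print(err)
```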

I am running this through a bash script on a server with the following command-line arguments:

python demo.py --imagedir=/bagfiles/mapping_left --calib=calib/S3LI.txt --reconstruction_path=/results/mapping --disable_vis

The dataset is a sequence of 688x512 RGB images, I am using the weights file provided in the google drive download link from the demo section. The GPU being used is an NVIDIA Tesla V100.

Previous runs of inference on the example demos execute without problems. Does this mean the algorithm cannot fully perform SLAM on this dataset, or is there something in the code that I should modify?

The respective output log is attached, thanks in advance.

output_d_mapping_341.log

AndrewTKent commented 1 year ago

@jorgepradoh I'm running into this issue a lot too; not sure what it's caused by.

Edit 1: It looks like it initializes a set of fixed-size vectors storing the state of the trajectory, etc. Maybe specifying a longer buffer will suffice? I'll update this comment with the answer.

Edit 2: Yes, that's the solution. I'll comment back here if somewhere down the line this turns out not to be the case.

Edit 3: It is the solution. The only drawback is that during global bundle adjustment a larger buffer requires a GPU with more memory. I've been using NVIDIA A10s (24 GB) and have been comfortably processing videos with 13k frames with a buffer of 1100.

kwea123 commented 11 months ago

TL;DR: the solution is to specify a larger buffer, e.g. --buffer 1024
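Applied to the command from the original report, that would look like the following (demo.py accepts --buffer as a command-line flag, with 512 as the default):

```shell
python demo.py --imagedir=/bagfiles/mapping_left --calib=calib/S3LI.txt \
    --reconstruction_path=/results/mapping --buffer=1024 --disable_vis
```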

notabigfish commented 9 months ago

> @jorgepradoh I'm running into this issue a lot too; not sure what it's caused by.
>
> Edit 1: It looks like it initializes a set of fixed-size vectors storing the state of the trajectory, etc. Maybe specifying a longer buffer will suffice? I'll update this comment with the answer.
>
> Edit 2: Yes, that's the solution. I'll comment back here if somewhere down the line this turns out not to be the case.
>
> Edit 3: It is the solution. The only drawback is that during global bundle adjustment a larger buffer requires a GPU with more memory. I've been using NVIDIA A10s (24 GB) and have been comfortably processing videos with 13k frames with a buffer of 1100.

Thanks for your suggestion. My dataset has 4541 frames, and I set the buffer to 1024, but the "out of bounds" error shows up again. Does this mean the buffer size should be larger than the number of frames?
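Note that the buffer stores keyframes rather than every input frame, so the required size depends on how many keyframes the sequence produces, with the total frame count as a safe (if memory-hungry) upper bound. A rough way to pick a value is sketched below; `suggest_buffer` is a hypothetical helper, and the 0.3 keyframe ratio and 64-slot margin are assumptions, not numbers from DROID-SLAM.

```python
# Hypothetical helper for choosing a --buffer value. The keyframe ratio
# and margin are assumptions for illustration; the actual keyframe count
# depends on motion and the keyframing thresholds, so the frame count is
# used as a hard upper bound.
def suggest_buffer(num_frames: int, keyframe_ratio: float = 0.3, margin: int = 64) -> int:
    estimate = int(num_frames * keyframe_ratio) + margin
    return min(estimate, num_frames)

print(suggest_buffer(4541))  # → 1426 for the 4541-frame dataset above
```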