xaldyz / dataflow-orbslam

Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

Does not reach the 30 frames per second mentioned in the paper #3

Closed · zlgybz closed this 3 years ago

zlgybz commented 4 years ago

Hi, your work is great! I tried your project on a TX2, using sequence 00 of the KITTI dataset. The average time per frame is 0.117 s, which does not reach the 30 frames per second mentioned in the paper. The original ORB-SLAM2 took an average of 0.107 s per frame. I used a monitoring tool to check GPU usage while running your project, but the GPU usage is very low. Did I miss any steps? Why does this happen?

xaldyz commented 4 years ago

Hello, sorry for the late reply, and thanks for the compliment!

Yeah, if the GPU is not used, that means some tweaks need to be made to the CMake configuration. If you look through the options using the ccmake command, you will see there are two options that need to be modified.

The first one enables GPU acceleration, which is good enough for the KITTI dataset; to push performance further, the second flag enables pipelining of multiple frames. By combining both of them, you should be able to reach the 30 FPS mark. Remember to run jetson_clocks, otherwise you will get very bizarre results due to DVFS.
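The steps above might look like the following on a TX2. This is only a sketch: `USE_CUSTOM_VX` is the flag mentioned elsewhere in this thread, while `PIPELINE_FLAG` is a placeholder for the second option, whose real name should be checked in the ccmake UI.

```shell
# Sketch of the tuning steps above; verify the exact option names in ccmake.
cd build
ccmake ..                 # interactively set USE_CUSTOM_VX=TRUE and the
                          # pipelining flag (PIPELINE_FLAG is a placeholder)
make -j4

# Lock the board to maximum performance so DVFS does not skew timings:
sudo nvpmodel -m 0        # power model with all 6 cores online
sudo jetson_clocks
```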

I'll keep this open for a week or two; if nothing new comes up, I'll proceed to close it.

zlgybz commented 4 years ago

Thanks for your reply. But after I changed USE_CUSTOM_VX to TRUE, the following errors occurred at runtime:

```
Error calling pthread_setaffinity_np:22
Error calling pthread_setaffinity_np:22
Segmentation fault (core dumped)
```

xaldyz commented 4 years ago

About the pthread_setaffinity_np error: the software assumes 6 cores are available, which you can enable by running the command sudo nvpmodel -m 0. I don't remember offhand the source of the segmentation fault, but I recall having seen it. Due to the COVID situation, I have difficulty reaching the board where the code is, so I cannot inspect this directly. Can you give me more insight into where it is happening? I believe it could be after the initialization in src/System.cc, or in the constructor of src/ORBExtractor.cc.

zlgybz commented 4 years ago

Thanks for your reply. The terminal output is as follows:

```
Camera Parameters:

ORB Extractor Parameters:

Start processing sequence ...
Images in the sequence: 1022

Segmentation fault (core dumped)
```

xaldyz commented 4 years ago

@zlgybz sorry for the late reply. I kept thinking about what kind of problem you could have, since I downloaded the code from this repo into a clean TX2 environment and got zero issues. Then it hit me: you mentioned you use sequence 00, so I presume you run it with the "KITTI00-02.yaml" settings file. This file is missing two important parameters that are needed for the GPU part to function correctly; an example can be found in the KITTI04-12.yaml settings file. Set those two parameters to the correct width and height of the images, and everything should run accordingly.
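For illustration only: the exact keys should be copied from KITTI04-12.yaml, and the names below follow the common ORB-SLAM settings-file convention, so they are assumptions. KITTI sequences 00–02 use 1241×376 images, so the fix would amount to something like:

```yaml
# Hypothetical key names -- copy the real ones from KITTI04-12.yaml.
# KITTI sequences 00-02 use 1241x376 images.
Camera.width: 1241
Camera.height: 376
```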

seomk97 commented 3 years ago

Hi, can you give more detail about the metric "the percentage of reconstructed map"? Is there any ground-truth 3D feature-point map in KITTI? Thank you for the great work!

xaldyz commented 3 years ago

> Hi, Can you give more detail about the metric "The percentage of reconstructed map"? Is there any ground truth 3D feature point map in KITTI? Thank you for great job

Hi @seomk97, the percentage of reconstructed map is exactly what it sounds like: how much of the map the algorithm was able to reconstruct on the specific board, in our case the Jetson TX2. We noted that, even though the average speed was good, some computational spikes took a bit more time (for example, when there is a loop check). This caused the system to lose localization, so only a partial map is available, since the system is then unable to generate more points. The point where tracking was lost was quite consistent across different runs, so we decided to make the metric available for future work. About KITTI: the ground-truth 3D trajectories are available on their site, along with code to compute the difference from the trajectory you provide!

Please open a new issue if a question is not related to the original one, to avoid divergence 😉

seomk97 commented 3 years ago

@xaldyz Thanks for your reply! I only partially understood your explanation. I couldn't find a ground-truth 3D map in the KITTI datasets. How exactly can you measure the percentage of reconstructed map for your algorithm? I mean, where is the 100% reconstructed map? (I should have opened a new issue, really sorry about that.)