MarvinChung / Orbeez-SLAM

GNU General Public License v3.0
260 stars 28 forks

Error caused by GPU memory #22

Open Wingerllyyy opened 5 months ago

Wingerllyyy commented 5 months ago

I run the TUM sequence with the command "./build/mono_tum Vocabulary/ORBvoc.txt configs/Monocular/TUM/freiburg3_office.yaml ./rgbd_dataset_freiburg3_long_office_household/" in Docker on a GTX 1660 GPU, and I get the error shown in the attached image.

Copilot gave the following advice: "Reduce the n_neurons parameter in the FullyFusedMLP model configuration. This will decrease the memory requirement of the model. Alternatively, use CutlassMLP, which might offer better compatibility but could be slower."

I wonder if either of these suggestions would actually work. I also noticed that my GPU memory usage is about 6 GB, whereas the usage reported in the paper and by other users is about 9 GB. Can I run this with my GPU, or do I have to change to a different one?
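
For reference, my understanding of Copilot's suggestion is a change along these lines in the network block of the NeRF config. This is only a sketch assuming an instant-ngp / tiny-cuda-nn style JSON config; the exact file location, field names, and values (n_neurons of 64 is just an illustrative lower value) are my guesses, not taken from this repo:

```json
{
  "network": {
    "otype": "CutlassMLP",
    "activation": "ReLU",
    "output_activation": "None",
    "n_neurons": 64,
    "n_hidden_layers": 2
  }
}
```

Here "otype" would be switched from "FullyFusedMLP" to "CutlassMLP", and "n_neurons" lowered to reduce memory, at the cost of speed and possibly tracking/rendering quality.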