jac99 / MinkLoc3D

MinkLoc3D: Point Cloud Based Large-Scale Place Recognition

Config file for VladNet #4

Closed JBKnights closed 3 years ago

JBKnights commented 3 years ago

Hello,

I was interested in training with VladNet using the model provided in your repository for comparison; however, there doesn't seem to be an available config file. Would it be possible for you to provide one?

Thanks!

jac99 commented 3 years ago

Do you mean the model that is the same as my MinkLoc3D model, but with a NetVLAD aggregation layer instead of the final GeM (generalized mean) pooling layer? Simply change the model name (model parameter) from MinkFPN_GeM to MinkFPN_NetVlad in the model config file (minkloc3d.txt in the models folder):

[MODEL]
model = MinkFPN_NetVlad
mink_quantization_size = 0.01
planes = 32,64,64
layers = 1,1,1
num_top_down = 1
conv0_kernel_size = 5
feature_size = 256

You'll probably also need to decrease the maximum batch size (the batch_size_limit parameter) in the training config file (e.g. config_baseline.txt in the config folder), as the NetVLAD layer has a larger number of parameters and you can run out of GPU memory.
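As a rough illustration, assuming batch_size_limit sits in the [TRAIN] section of config_baseline.txt (the section name and the value below are illustrative guesses, not the repository's defaults), the change could look like:

[TRAIN]
# Lower the maximum batch size so the heavier MinkFPN_NetVlad model
# fits in GPU memory; pick a value that works for your GPU
batch_size_limit = 64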

jac99 commented 3 years ago

Other aggregation/global pooling layers that are supported, by simply changing the model name in the model config file, are listed in this excerpt from the model construction code:

        if model == 'MinkFPN_Max':
            # MAC: global max pooling over local features
            assert self.feature_size == self.output_dim, 'output_dim must be the same as feature_size'
            self.pooling = pooling.MAC()
        elif model == 'MinkFPN_GeM':
            # GeM: generalized mean pooling (the default MinkLoc3D head)
            assert self.feature_size == self.output_dim, 'output_dim must be the same as feature_size'
            self.pooling = pooling.GeM()
        elif model == 'MinkFPN_NetVlad':
            # NetVLAD aggregation without context gating
            self.pooling = MinkNetVladWrapper(feature_size=self.feature_size, output_dim=self.output_dim,
                                              cluster_size=64, gating=False)
        elif model == 'MinkFPN_NetVlad_CG':
            # NetVLAD aggregation with context gating
            self.pooling = MinkNetVladWrapper(feature_size=self.feature_size, output_dim=self.output_dim,
                                              cluster_size=64, gating=True)
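For reference, the NetVLAD variants aggregate local features by soft-assigning them to a set of learned cluster centroids and summing the residuals. The sketch below is a generic dense-tensor PyTorch illustration of that idea with the same hyperparameters as above (feature_size=256, cluster_size=64); it is not the repository's MinkNetVladWrapper, which operates on sparse Minkowski tensors, but it shows where the extra parameters (and GPU memory) come from compared to GeM pooling.

import torch
import torch.nn as nn
import torch.nn.functional as F

class NetVladSketch(nn.Module):
    # Illustrative NetVLAD aggregation over dense (batch, n_points, dim) features.
    # Not the MinkNetVladWrapper used in MinkLoc3D, which works on sparse tensors.
    def __init__(self, feature_size=256, output_dim=256, cluster_size=64):
        super().__init__()
        # Soft-assignment weights and learned cluster centroids: these are the
        # extra parameters that GeM/MAC pooling do not have.
        self.assignment = nn.Linear(feature_size, cluster_size)
        self.centroids = nn.Parameter(torch.randn(cluster_size, feature_size))
        # Project the flattened (cluster_size * feature_size) VLAD vector to output_dim.
        self.projection = nn.Linear(cluster_size * feature_size, output_dim)

    def forward(self, x):
        # x: (B, N, feature_size) local descriptors
        soft_assign = F.softmax(self.assignment(x), dim=-1)                      # (B, N, K)
        residuals = x.unsqueeze(2) - self.centroids.unsqueeze(0).unsqueeze(0)    # (B, N, K, D)
        vlad = (soft_assign.unsqueeze(-1) * residuals).sum(dim=1)                # (B, K, D)
        vlad = F.normalize(vlad, p=2, dim=-1)                                    # intra-normalization
        vlad = F.normalize(vlad.flatten(1), p=2, dim=-1)                         # global L2 norm
        return self.projection(vlad)                                             # (B, output_dim)

With feature_size=256 and cluster_size=64 the flattened VLAD vector has 16,384 dimensions, so the final projection alone adds roughly 4M parameters, which is why this head is much heavier than a single GeM pooling operation and why you may need a smaller batch_size_limit.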
jac99 commented 3 years ago

Closing