MarSaKi / VLN-BEVBert

[ICCV 2023] Official repo of "BEVBert: Multimodal Map Pre-training for Language-guided Navigation"

FLOPs calculation #1

Closed lu-renjie closed 1 year ago

lu-renjie commented 1 year ago

How did you calculate the FLOPs reported in your paper?

MarSaKi commented 1 year ago

I used the "thop" package to count the FLOPs of one forward pass of the short-term transformer. Below is a demo (`BEV_DIM` is the side length of the BEV grid):

```python
import torch
from thop import profile
from transformers import PretrainedConfig

config = PretrainedConfig.from_pretrained('bert_config/bert-base-uncased')
config.bev_dim = BEV_DIM
config.hidden_size = 768
config.num_x_layers = 4
config.use_lang2visn_attn = False
model = LocalBEVEncoder(config)

txt_embeds = torch.zeros([1, 80, 768])
txt_masks = torch.ones([1, 80]).bool()
bev_fts = torch.zeros([1, BEV_DIM * BEV_DIM, 768])
bev_pos_fts = torch.zeros([1, BEV_DIM * BEV_DIM, 10])
bev_masks = torch.ones([1, BEV_DIM * BEV_DIM]).bool()
bev_nav_masks = torch.ones([1, BEV_DIM * BEV_DIM]).bool()

flops, params = profile(model, inputs=(txt_embeds, txt_masks, bev_fts, bev_pos_fts,
                                       bev_masks, bev_nav_masks, None, None))
print('FLOPs = ' + str(flops / 1000**3) + 'G')
print('Params = ' + str(params / 1000**2) + 'M')
```
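As a rough sanity check on the profiler's output, the dominant FLOPs of a cross-modal transformer layer can also be estimated analytically from the matrix-multiply sizes. This is a minimal sketch, not the paper's method: the `BEV_DIM` value, the layer composition, and the FFN width are illustrative assumptions, and small terms (softmax, layer norm, biases) are ignored.

```python
# Analytic FLOPs estimate for a stack of cross-modal transformer layers.
# All sizes below are illustrative assumptions, not values from the paper.
HIDDEN = 768      # config.hidden_size
BEV_DIM = 14      # hypothetical BEV grid side length
TXT_LEN = 80      # text sequence length used in the demo code
NUM_LAYERS = 4    # config.num_x_layers

def attention_flops(q_len, kv_len, hidden):
    """FLOPs of one attention block: Q/out projections on the query side,
    K/V projections on the key/value side, plus the two attention matmuls.
    Multiply-accumulates are counted as 2 FLOPs each."""
    proj = 2 * q_len * hidden * hidden + 2 * kv_len * hidden * hidden
    scores = q_len * kv_len * hidden      # Q @ K^T
    context = q_len * kv_len * hidden     # softmax(scores) @ V
    return 2 * (proj + scores + context)

def ffn_flops(seq_len, hidden, inner=4 * 768):
    """FLOPs of the two linear layers in the feed-forward block."""
    return 2 * seq_len * (hidden * inner + inner * hidden)

bev_len = BEV_DIM * BEV_DIM
per_layer = (
    attention_flops(bev_len, bev_len, HIDDEN)    # visual self-attention
    + attention_flops(bev_len, TXT_LEN, HIDDEN)  # visual-to-text cross-attention
    + ffn_flops(bev_len, HIDDEN)
)
total = NUM_LAYERS * per_layer
print('Estimated FLOPs = ' + str(total / 1000**3) + 'G')
```

If thop reports a number far from such an estimate, it usually means some submodule (e.g. a custom layer) was not registered with a FLOPs counter and was silently skipped.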

lu-renjie commented 1 year ago

Ok, thank you.

MarSaKi commented 1 year ago

Received, thank you. (auto-reply)

Please consider giving me a star. Thanks :D