zhihou7 / BatchFormer

CVPR2022, BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning, https://arxiv.org/abs/2203.01522

Questions about parameter settings #25

Open cbn3 opened 10 months ago

cbn3 commented 10 months ago

I'm sorry to bother you. I found that `bf=0` and `share_bf=1` in the Deformable-DETR BatchFormerV2 main.py, while the command line in the README uses `bf=1` and `share_bf=0`. If I want to use BatchFormerV2, which setting is correct? Also, does `insert_idx` need to be set in main.py? If so, what should it be set to?

[Screenshot 2023-09-06 15:24:00] [Screenshot 2023-09-06 15:23:15]
zhihou7 commented 10 months ago

Hi, thanks for your interest, and sorry for the confusing argument.

`--share_bf 0 --bf 1 --insert_idx 0` indicates that we use BatchFormer (`bf 1`), we do not share the BatchFormer across different layers, and we insert the BatchFormer in the first layer only (`insert_idx 0`). Since we only insert the BatchFormer module in the first layer, it is meaningless to set `share_bf`.

Best,
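
For readers following this thread, below is a minimal, hypothetical sketch of how `bf`, `share_bf`, and `insert_idx` could interact in a Deformable-DETR style encoder. It is not the actual repository code; the `BatchFormer` and `Encoder` classes and their argument names are assumptions based only on the flags discussed above.

```python
# Hypothetical sketch (not the repository's implementation) of how bf, share_bf
# and insert_idx might be wired into a Deformable-DETR style encoder.
import torch.nn as nn


class BatchFormer(nn.Module):
    """A transformer layer applied across the batch dimension (the BatchFormer idea)."""

    def __init__(self, dim=256, nhead=8):
        super().__init__()
        self.layer = nn.TransformerEncoderLayer(d_model=dim, nhead=nhead)

    def forward(self, x):
        # x: (sequence, batch, dim); permute so attention runs over the batch
        # dimension independently for each spatial position.
        b = x.permute(1, 0, 2)   # (batch, sequence, dim)
        b = self.layer(b)        # samples in the mini-batch exchange information
        return b.permute(1, 0, 2)


class Encoder(nn.Module):
    def __init__(self, num_layers=6, dim=256, bf=1, share_bf=0, insert_idx=(0,)):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model=dim, nhead=8) for _ in range(num_layers)
        )
        self.insert_idx = set(insert_idx) if bf else set()
        if self.insert_idx:
            if share_bf:
                shared = BatchFormer(dim)  # one module reused at every insert index
                self.bf = nn.ModuleDict({str(i): shared for i in self.insert_idx})
            else:
                self.bf = nn.ModuleDict({str(i): BatchFormer(dim) for i in self.insert_idx})

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            x = layer(x)
            if i in self.insert_idx:    # with insert_idx=(0,), BatchFormer runs after
                x = self.bf[str(i)](x)  # layer 0 only, so share_bf has no effect
        return x
```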

cbn3 commented 10 months ago

Thank you for your prompt reply. I will give it a try!

cbn3 commented 10 months ago

> Hi, thanks for your interest, and sorry for the confusing argument.
>
> `--share_bf 0 --bf 1 --insert_idx 0` indicates that we use BatchFormer (`bf 1`), we do not share the BatchFormer across different layers, and we insert the BatchFormer in the first layer only (`insert_idx 0`). Since we only insert the BatchFormer module in the first layer, it is meaningless to set `share_bf`.
>
> Best,

So should `share_bf` be set to 0?

zhihou7 commented 7 months ago

Sorry for the late reply; sometimes I do not receive the email notification for these messages. `share_bf` means the BatchFormer module is shared among different layers. If you only insert it at a single layer, `share_bf` is meaningless.
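
To make the answer concrete, reusing the hypothetical `Encoder` sketch from earlier in this thread (argument names are assumptions based on this discussion, not the repository's API), the two settings only differ when BatchFormer is inserted at more than one layer:

```python
# With a single insert index, the shared and unshared configurations are equivalent:
enc_a = Encoder(bf=1, share_bf=0, insert_idx=(0,))  # one BatchFormer after layer 0
enc_b = Encoder(bf=1, share_bf=1, insert_idx=(0,))  # identical: only one module exists anyway

# The flag only matters when inserting at several layers:
enc_c = Encoder(bf=1, share_bf=1, insert_idx=(0, 2, 4))  # one module reused three times
enc_d = Encoder(bf=1, share_bf=0, insert_idx=(0, 2, 4))  # three independent modules
```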