IDEA-Research / DAB-DETR

[ICLR 2022] Official implementation of the paper "DAB-DETR: Dynamic Anchor Boxes are Better Queries for DETR"
Apache License 2.0

'self.query_scale' before each transformer encoder layer #39

Closed · wenzhengzeng closed this 2 years ago

wenzhengzeng commented 2 years ago

Thanks for your great work. I noticed a difference between DAB-DETR and Conditional DETR: there is an MLP, defined as `self.query_scale`, applied before each transformer encoder layer. Is this operation described in the paper, or in another reference paper that explains its effect?

SlongLiu commented 2 years ago

It is inherited from Conditional DETR, which you can refer to for more details. It was called the conditional operation in the original paper and helped the models achieve better results. We follow the settings of Conditional DETR. However, we found it provided no gains in our DAB-Deformable-DETR, so we removed it there.
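
For context, here is a minimal PyTorch sketch of the pattern under discussion, not the repository's exact code: an MLP conditioned on the current content features rescales the positional embedding before every encoder layer. The `EncoderLayerStub` and `ConditionalScaleEncoder` names are hypothetical stand-ins, and the layer call signature is simplified; only the `self.query_scale` MLP and its placement follow the thread.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Simple multi-layer perceptron, in the style of the DETR codebases."""
    def __init__(self, input_dim, hidden_dim, output_dim, num_layers):
        super().__init__()
        self.num_layers = num_layers
        h = [hidden_dim] * (num_layers - 1)
        self.layers = nn.ModuleList(
            nn.Linear(n, k) for n, k in zip([input_dim] + h, h + [output_dim])
        )

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            x = torch.relu(layer(x)) if i < self.num_layers - 1 else layer(x)
        return x

class EncoderLayerStub(nn.Module):
    """Hypothetical stand-in for a real self-attention encoder layer."""
    def __init__(self, d_model):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, num_heads=8)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, src, pos):
        q = k = src + pos  # positional embedding enters through queries/keys
        return self.norm(src + self.attn(q, k, value=src)[0])

class ConditionalScaleEncoder(nn.Module):
    """Encoder that rescales the positional embedding before each layer."""
    def __init__(self, num_layers=6, d_model=256):
        super().__init__()
        self.layers = nn.ModuleList(
            EncoderLayerStub(d_model) for _ in range(num_layers)
        )
        # The MLP in question: content features -> per-element scale factors.
        self.query_scale = MLP(d_model, d_model, d_model, 2)

    def forward(self, src, pos):
        output = src
        for layer in self.layers:
            # Condition the positional embedding on the current content
            # features before every layer (the "conditional" operation).
            pos_scales = self.query_scale(output)
            output = layer(output, pos=pos * pos_scales)
        return output

# Usage: (sequence_len, batch, d_model) tensors, as in nn.MultiheadAttention.
feats = torch.randn(100, 2, 256)
pos_embed = torch.randn(100, 2, 256)
out = ConditionalScaleEncoder()(feats, pos_embed)
```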

wenzhengzeng commented 2 years ago

Thanks for your reply. I found that Conditional DETR does not have this operation, but DAB-DETR does. The specific location is DAB-DETR/models/DAB_DETR/transformer.py, line 154. Can I assume that removing this operation would have little effect on the results?

SlongLiu commented 2 years ago

In our experiments, it is helpful to DAB-DETR, but harmful to DAB-Deformable-DETR.

wenzhengzeng commented 2 years ago

Thanks for your reply.