Closed — wenzhengzeng closed this issue 2 years ago
It is inherited from Conditional DETR, which you can refer to for more details. It was called the "conditional" operation in the original paper and helped the models achieve better results, so we followed the settings of Conditional DETR. However, we found it provided no gains in our DAB-Deformable-DETR, hence we removed it there.
Thanks for your reply. I found that there is no such operation in Conditional DETR, but there is one in DAB-DETR. The specific position is DAB-DETR/models/DAB_DETR/transformer.py, line 154. Can I assume that removing this operation will have little effect on the results?
In our experiments, it is helpful for DAB-DETR but harmful for DAB-Deformable-DETR.
Thanks for your reply.
Thanks for your great work. I noticed a difference between DAB-DETR and Conditional DETR: there is an MLP, defined as `self.query_scale`, applied before each transformer encoder layer. Is there a description of this operation in the paper, or in another reference paper, explaining its effect?
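For anyone else reading this thread, a minimal sketch of what such a content-conditioned scale MLP does in an encoder layer — all names and shapes here are illustrative assumptions, not the exact DAB-DETR code:

```python
import torch
import torch.nn as nn

class ScaleMLP(nn.Module):
    """Illustrative 2-layer MLP producing per-dimension scales (hypothetical
    stand-in for the query_scale module discussed above)."""
    def __init__(self, d_model=256, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, hidden),
            nn.ReLU(),
            nn.Linear(hidden, d_model),
        )

    def forward(self, x):
        return self.net(x)

d_model = 256
query_scale = ScaleMLP(d_model)

# Toy inputs: (seq_len, batch, d_model), the layout PyTorch's
# nn.MultiheadAttention uses by default.
src = torch.randn(10, 2, d_model)  # content features entering a layer
pos = torch.randn(10, 2, d_model)  # positional embedding

# The content features condition a per-dimension scale that modulates the
# positional embedding before it is fed into attention.
pos_scales = query_scale(src)
scaled_pos = pos * pos_scales  # element-wise modulation, same shape as pos
```

The point of the sketch: the positional embedding is no longer fixed per layer but is rescaled element-wise as a function of the current content features, which is the "conditional" behavior referred to in the replies above.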