HorizonRobotics / Sparse4D


About DecoupledAttention and anchor/temp_anchor embedding #15

Closed · XXXVincent closed this 9 months ago

XXXVincent commented 9 months ago

Under the currently released config (`sparse4dv3_temporal_r50_1x8_bs6_256x704.py`), `decouple_attn = True`, which means that when `self.graph_model` is applied, `anchor_embed` and `temp_anchor_embed` are never used.

[screenshot of the relevant code]

The multi-head attention only takes `instance_feature` and `temp_instance_feature` as inputs.

It seems that the explicit historical anchors are only used to compute `temp_anchor_embed`; however, `temp_anchor_embed` is not used in later operations, and only the historical instance features are used.

May I ask why `anchor_embed`/`temp_anchor_embed` are not used? Does this setting lead to a performance drop? Is it a bad idea to collect and aggregate features near the projected historical detections?
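
For concreteness, here is roughly how the call site reads to me; the shapes and names below are illustrative, not the repo's exact code:

```python
import torch

B, N, C = 2, 900, 256                     # batch, num instances, embed dim
instance_feature = torch.randn(B, N, C)
temp_instance_feature = torch.randn(B, N, C)
anchor_embed = torch.randn(B, N, C)       # embedding of current anchors
temp_anchor_embed = torch.randn(B, N, C)  # embedding of historical anchors

attn = torch.nn.MultiheadAttention(C, num_heads=8, batch_first=True)

# What the temporal cross-attention appears to consume: only the
# features; anchor_embed / temp_anchor_embed look unused at this level.
out, _ = attn(query=instance_feature,
              key=temp_instance_feature,
              value=temp_instance_feature)
```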

linxuewu commented 9 months ago

`query = torch.cat([query, query_pos], dim=-1)` concatenates `anchor_embed` (passed as `query_pos`) with `instance_feature`, so `anchor_embed` is in fact used.
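
A minimal sketch of this decoupled-attention pattern, using hypothetical module and parameter names (the attention runs at twice the feature width because query and key are the concatenation `[feature, embed]`, so projections are needed around it):

```python
import torch
import torch.nn as nn

class DecoupledTemporalAttn(nn.Module):
    """Sketch: the anchor (positional) embedding is concatenated to the
    feature instead of added to it, so attention weights are computed on
    [feature, position] jointly while the value path stays feature-only."""

    def __init__(self, dim=256, num_heads=8):
        super().__init__()
        # Attention runs at 2 * dim because query/key are [feature, embed].
        self.attn = nn.MultiheadAttention(2 * dim, num_heads, batch_first=True)
        self.fc_before = nn.Linear(dim, 2 * dim)  # lift value to 2 * dim
        self.fc_after = nn.Linear(2 * dim, dim)   # project output back to dim

    def forward(self, instance_feature, anchor_embed,
                temp_instance_feature, temp_anchor_embed):
        # The embeddings ARE used: they enter via concatenation.
        query = torch.cat([instance_feature, anchor_embed], dim=-1)
        key = torch.cat([temp_instance_feature, temp_anchor_embed], dim=-1)
        value = self.fc_before(temp_instance_feature)
        out, _ = self.attn(query, key, value)
        return self.fc_after(out)

# Tiny smoke test with illustrative shapes.
B, N, C = 2, 900, 256
layer = DecoupledTemporalAttn(dim=C)
out = layer(torch.randn(B, N, C), torch.randn(B, N, C),
            torch.randn(B, N, C), torch.randn(B, N, C))
assert out.shape == (B, N, C)
```

With this layout, no separate `query_pos`/`key_pos` is passed into the attention layer itself, which is why the embeddings look unused at the call site even though they entered the query and key one step earlier.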

XXXVincent commented 9 months ago

Yes indeed! Thanks! Nice work btw.