iamhankai / attribute-aware-attention

[ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning
https://arxiv.org/abs/1901.00392

How to include in our Project? #21

Open Rajasekhar06 opened 3 years ago

Rajasekhar06 commented 3 years ago

Hello, I was intrigued by this paper. I'm trying to build an object tracker, and I would like to use your work together with DETR (transformer-based object detection) or Re3 (an LSTM-based generic object tracking algorithm) in order to reduce errors caused by occlusion. Correct me if I'm wrong, but since both of them depend on attention mechanisms, all I need to understand is how your A3M attention can be injected into them. I'm trying to figure it out myself, but with your support it will be much easier to make use of your work. I know I'm asking for a lot, but please help me out.

Thanks, Rajashekar

iamhankai commented 3 years ago

Thanks for your interest. The input to the A3M attention module is feature maps from a CNN; you could modify the code so it takes features from a transformer or LSTM instead. See https://github.com/iamhankai/attribute-aware-attention/blob/4b4cf873d6e398f1e64891dbc34ccb8fbd891f30/cub_demo.py#L109
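
For illustration, here is a minimal sketch (not from this repo) of the kind of adaptation described above: reshaping transformer token features back into a 2-D feature map so they can be fed to an attention module written for CNN-style inputs. The `FeatureAdapter` class, the tensor shapes, and the placeholder encoder output are assumptions for illustration, not the repo's actual API.

```python
import torch
import torch.nn as nn


class FeatureAdapter(nn.Module):
    """Reshape transformer token features (B, N, C) into CNN-style
    feature maps (B, C, H, W) so they can be passed to an attention
    module that expects convolutional feature maps."""

    def __init__(self, height: int, width: int):
        super().__init__()
        self.height = height
        self.width = width

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, C) with N == height * width
        b, n, c = tokens.shape
        assert n == self.height * self.width, "token count must match H*W"
        return tokens.transpose(1, 2).reshape(b, c, self.height, self.width)


# Usage sketch: DETR's encoder emits per-position tokens; reshape them and
# feed the result wherever the demo currently feeds CNN feature maps.
adapter = FeatureAdapter(height=25, width=25)
encoder_tokens = torch.randn(2, 25 * 25, 256)  # placeholder for transformer output
feature_maps = adapter(encoder_tokens)         # (2, 256, 25, 25)
```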