JialeCao001 / PSTR

PSTR: End-to-End One-Step Person Search With Transformers (CVPR2022)
https://arxiv.org/abs/2204.03340
Apache License 2.0

Part Attention Block #4

Open Suvashsharma opened 2 years ago

Suvashsharma commented 2 years ago

Hi there, I wanted to understand more about the part attention block: how does it actually attend to the parts of a person in the image? Also, I could not find the specific code that implements the part attention block/layer. Could you please help me locate it in the repo and explain a bit more about how part attention works?

NguyenVanThanhHust commented 1 year ago

@Suvashsharma I think it is in this file: https://github.com/JialeCao001/PSTR/blob/main/mmdet/models/dense_heads/pstr_head.py
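For intuition, here is a minimal sketch of the general idea behind a part attention block, assuming it works like learned part queries cross-attending to an instance's feature tokens via multi-head attention. This is a hypothetical illustration, not the actual PSTR implementation; the `PartAttentionBlock` class, its parameter names, and the part count are all assumptions for demonstration, so check `pstr_head.py` for the real details.

```python
import torch
import torch.nn as nn

class PartAttentionBlock(nn.Module):
    """Hypothetical sketch: K learnable part queries cross-attend to an
    instance's feature tokens, producing one descriptor per body part.
    The attention weights show which spatial tokens each part focuses on."""

    def __init__(self, dim=256, num_parts=3, num_heads=8):
        super().__init__()
        # One learnable query per part (assumption: parts are learned, not fixed)
        self.part_queries = nn.Parameter(torch.randn(num_parts, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, feats):
        # feats: (B, N, dim) -- N flattened spatial tokens per person instance
        b = feats.size(0)
        q = self.part_queries.unsqueeze(0).expand(b, -1, -1)  # (B, K, dim)
        # Queries attend over the instance tokens; keys/values are the features
        parts, attn_weights = self.attn(q, feats, feats)
        return self.norm(parts), attn_weights

block = PartAttentionBlock()
feats = torch.randn(2, 49, 256)       # e.g. 7x7 RoI feature map, flattened
parts, weights = block(feats)
print(parts.shape)                    # (2, 3, 256): one vector per part
print(weights.shape)                  # (2, 3, 49): per-part spatial attention
```

Inspecting `weights` per part query is one way to visualize what spatial regions each "part" attends to.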