microsoft / DynamicHead


Missing the code of scale-aware and task-aware block? #30

Open kriskrisliu opened 2 years ago

kriskrisliu commented 2 years ago

Thank you for sharing the code! I'm looking for the details of the scale-aware, spatial-aware, and task-aware blocks in the source code. However, as far as I can see in "dyhead/dyhead.py", only the spatial-aware block seems to be implemented. Is this a mistake, or am I misunderstanding something?

sevenandseven commented 2 years ago

I also can't find the scale-aware block. Can you tell me where it is? Thank you.

sourabh-patil commented 2 years ago

@kriskrisliu In dyhead.py, lines 79-82, you can see all three attentions being used. I think they first apply spatial attention using deformable convolutions, then level (scale) attention using a hard sigmoid, and finally task attention through DyReLU. It is confusing to follow because this is done for each level of the pyramid inside a for loop; see the sketch below.
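
For reference, here is a minimal sketch (not the repository code) of how the spatial- and scale-aware attentions can be chained per pyramid level. The class name is made up, a plain 3x3 convolution stands in for the modulated deformable convolution with learned offsets, and the task-aware DyReLU is reduced to a plain ReLU placeholder here.

```python
# Illustrative only -- not the code from dyhead/dyhead.py.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyDyHeadBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Spatial-aware: the paper uses a modulated deformable conv with learned
        # offsets; a regular conv is used here to keep the sketch short.
        self.spatial = nn.Conv2d(channels, channels, 3, padding=1)
        # Scale-aware: 1x1 conv on globally pooled features -> hard-sigmoid weight.
        self.scale_fc = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feat):
        x = self.spatial(feat)                                        # spatial-aware attention
        w = F.hardsigmoid(self.scale_fc(F.adaptive_avg_pool2d(x, 1)))  # (B, 1, 1, 1) level weight
        x = x * w                                                      # scale-aware attention
        return F.relu(x)                                               # placeholder for the task-aware DyReLU


if __name__ == "__main__":
    # The block is applied to every FPN level inside a loop, which is why all
    # three attentions appear together in the per-level code path.
    levels = [torch.randn(2, 256, s, s) for s in (64, 32, 16)]
    block = ToyDyHeadBlock(256)
    print([block(f).shape for f in levels])
```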

zqx951102 commented 2 years ago

I cannot find it either!

zqx951102 commented 2 years ago

I can find the scale-aware implementation, and the spatial-aware part uses a deformable convolution to obtain the offsets, but I can't find the code for the task-level implementation!
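
For what it's worth, the task-aware part corresponds to a DyReLU-style activation. Below is a minimal sketch (not the repository implementation; the class name is made up, and it is simplified to two linear branches) of what such a module roughly looks like.

```python
# Illustrative DyReLU-style task-aware activation (not the repository code).
# Slopes and intercepts of two linear branches are predicted from global
# context, and the output is the element-wise maximum of the two branches.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyDyReLU(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 4 * channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        ctx = F.adaptive_avg_pool2d(x, 1).flatten(1)   # (B, C) global context
        coeffs = self.fc(ctx).view(b, 4, c, 1, 1)
        a1 = 1.0 + coeffs[:, 0]                        # first slope, near identity at init
        b1 = coeffs[:, 1]                              # first intercept
        a2 = coeffs[:, 2]                              # second slope
        b2 = coeffs[:, 3]                              # second intercept
        return torch.maximum(a1 * x + b1, a2 * x + b2)


if __name__ == "__main__":
    y = ToyDyReLU(256)(torch.randn(2, 256, 32, 32))
    print(y.shape)  # torch.Size([2, 256, 32, 32])
```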