raoyongming / DenseCLIP

[CVPR 2022] DenseCLIP: Language-Guided Dense Prediction with Context-Aware Prompting

code about Pre-model prompting #11

Closed Ahnsun closed 2 years ago

Ahnsun commented 2 years ago

Hi, the paper proposes two context prompting methods, pre-model and post-model prompting respectively. But I can only find the post-model method in the source code. Could you please provide the pre-model code? That would help a lot with understanding. Thanks!

raoyongming commented 2 years ago

Since post-model prompting is much more efficient and effective than pre-model prompting, we use the post-model strategy as the default setting in our final models. It is a bit difficult to make the code compatible with both versions, so I can send you our original implementation privately. Could you please give me your email address?
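For readers following along, here is a minimal toy sketch of where the two strategies differ. This is not the repository's implementation: the encoder, dimensions, and the residual update standing in for DenseCLIP's transformer-decoder refinement are all illustrative assumptions. The key contrast is that pre-model prompting injects learnable context tokens *before* the text encoder, while post-model prompting refines the already-encoded text features with visual context *after* it.

```python
import numpy as np

rng = np.random.default_rng(0)

EMBED_DIM = 8   # toy token-embedding size (assumption, not the paper's value)
N_CTX = 4       # number of learnable context tokens
N_CLASSES = 3
NAME_LEN = 2    # tokens per class name (toy)

# Stand-in "text encoder": mean-pool the token embeddings, then project.
W_text = rng.standard_normal((EMBED_DIM, EMBED_DIM))
def text_encoder(tokens):            # tokens: (L, EMBED_DIM)
    return tokens.mean(axis=0) @ W_text

class_name_tokens = rng.standard_normal((N_CLASSES, NAME_LEN, EMBED_DIM))

# --- Pre-model prompting: learnable context tokens are prepended to the
# class-name tokens BEFORE the text encoder sees them. ---
ctx = rng.standard_normal((N_CTX, EMBED_DIM))   # learnable prompt
pre_feats = np.stack([
    text_encoder(np.concatenate([ctx, class_name_tokens[c]], axis=0))
    for c in range(N_CLASSES)
])                                   # (N_CLASSES, EMBED_DIM)

# --- Post-model prompting: encode plain class names first, then refine the
# text features with visual context AFTER the encoder. (DenseCLIP uses a
# transformer decoder here; a simple residual update stands in for it.) ---
plain_feats = np.stack([text_encoder(class_name_tokens[c])
                        for c in range(N_CLASSES)])
visual_context = rng.standard_normal((EMBED_DIM,))  # e.g. pooled image feature
gamma = 0.1                                         # small residual scale
post_feats = plain_feats + gamma * visual_context   # broadcast residual update

print(pre_feats.shape, post_feats.shape)
```

In the pre-model variant the context tokens change what the text encoder reads, so every update to the prompt requires a fresh pass through the full encoder; in the post-model variant the class names can be encoded once and only the lightweight refinement depends on the image, which is why the paper finds it more efficient.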

Ahnsun commented 2 years ago

Sure, yuen@hust.edu.cn

kexibudongshi commented 1 year ago

Could you please share a copy of the pre-model prompting code with me? I am interested in it. This is my email: 1844235028@qq.com