microsoft / Relation-Aware-Global-Attention-Networks

We design an effective Relation-Aware Global Attention (RGA) module for CNNs to globally infer the attention.
MIT License

better loading of pretrained model #11

Open haruishi43 opened 3 years ago

haruishi43 commented 3 years ago

haruishi43 commented 3 years ago

I've tried with and without this revision, but I couldn't recreate the results reported in the paper; the revision didn't change the results much either way. I'll post some results on #9, since there's a lot of attention on results there.
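For context on what "this revision" changes, a common pattern for more robust pretrained-weight loading is to load non-strictly and keep only checkpoint entries whose name and shape match the target model. This is a hypothetical sketch, not the repo's code; the function name is made up, and tensors are stood in for by shape tuples so the example stays dependency-free (with real PyTorch one would combine this filtering with `load_state_dict(strict=False)`):

```python
def filter_pretrained(model_state, checkpoint):
    """Return the checkpoint entries loadable into model_state, plus the skipped keys."""
    loaded, skipped = {}, []
    for name, shape in checkpoint.items():
        if name in model_state and model_state[name] == shape:
            loaded[name] = shape
        else:
            skipped.append(name)  # missing key or shape mismatch
    return loaded, skipped

# Toy example: backbone weights match, classifier head has a different class count.
model_state = {"backbone.conv1": (64, 3, 7, 7), "classifier.fc": (751, 2048)}
checkpoint  = {"backbone.conv1": (64, 3, 7, 7), "classifier.fc": (1000, 2048)}

loaded, skipped = filter_pretrained(model_state, checkpoint)
print(sorted(loaded))   # only the matching backbone key is kept
print(skipped)          # the mismatched classifier head is reported
```

Logging the skipped keys makes silent partial loads (a frequent cause of unreproducible results) visible.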

haruishi43 commented 3 years ago

Couldn't recreate the results:

Evaluated with "feat_" features and "cosine" metric:
Mean AP: 33.2%
CMC Scores
  top-1          30.7%
  top-5          56.6%
  top-10         69.6%
Evaluated with "feat" features and "cosine" metric:
Mean AP: 21.4%
CMC Scores
  top-1          17.7%
  top-5          36.8%
  top-10         52.3%