Jianlong-Fu / Recurrent-Attention-CNN


AttentionCrop Layer: Where can we find it? #5

Closed ParitoshParmar closed 4 years ago

ParitoshParmar commented 7 years ago

Hi Jianlong,

You have used the AttentionCrop layer, but I think you have not released it. Could you please make it available?

Thank you

Geeshang commented 7 years ago

Same here. This Caffe code cannot run without the "AttentionCrop" layer, which I believe is one of the core layers of the paper. Where is it?

Geeshang commented 7 years ago

The new download link includes the "AttentionCrop" layer. https://1drv.ms/u/s!Ak3_TuLyhThpkxifVPt-w8e-axc5
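
For anyone who cannot use the prebuilt layer, the paper (Fu et al., CVPR 2017) describes AttentionCrop as a differentiable crop-and-zoom: the attention proposal network emits a square (tx, ty, tl) (centre and half side length), the input is multiplied by a boxcar mask built from steep sigmoids so the operation stays differentiable in the attention parameters, and the masked square is then zoomed up with bilinear interpolation. Below is a minimal NumPy sketch of the forward pass under those assumptions; the function names, the steepness constant `k`, and the output size are illustrative choices, not the repo's actual Caffe API.

```python
import numpy as np

def boxcar_mask(h, w, tx, ty, tl, k=10.0):
    """Soft box mask built from shifted logistic functions; with a large
    steepness k it approximates a hard step that is 1 inside the square
    centred at (tx, ty) with half side length tl, and 0 outside."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-k * z))
    xs = np.arange(w, dtype=np.float64)
    ys = np.arange(h, dtype=np.float64)
    mx = sigmoid(xs - (tx - tl)) - sigmoid(xs - (tx + tl))  # shape (W,)
    my = sigmoid(ys - (ty - tl)) - sigmoid(ys - (ty + tl))  # shape (H,)
    return my[:, None] * mx[None, :]                        # shape (H, W)

def bilinear_resize(img, out_h, out_w):
    """Plain bilinear interpolation to (out_h, out_w); works on
    (H, W) or (C, H, W) arrays."""
    h, w = img.shape[-2:]
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[..., y0[:, None], x0] * (1 - wx) + img[..., y0[:, None], x1] * wx
    bot = img[..., y1[:, None], x0] * (1 - wx) + img[..., y1[:, None], x1] * wx
    return top * (1 - wy) + bot * wy

def attention_crop(img, tx, ty, tl, out_size=224, k=10.0):
    """Forward pass: mask the attended square, crop it, and zoom it up
    to out_size x out_size, mimicking the paper's crop-and-zoom step."""
    h, w = img.shape[-2:]
    masked = img * boxcar_mask(h, w, tx, ty, tl, k)
    # integer crop bounds, clamped to the image and kept non-empty
    x0 = int(np.clip(tx - tl, 0, w - 1))
    y0 = int(np.clip(ty - tl, 0, h - 1))
    x1 = int(np.clip(tx + tl, x0 + 1, w))
    y1 = int(np.clip(ty + tl, y0 + 1, h))
    region = masked[..., y0:y1, x0:x1]
    return bilinear_resize(region, out_size, out_size)
```

The sigmoid mask is what makes the crop trainable: gradients with respect to (tx, ty, tl) flow through the mask, whereas a hard integer crop would have zero gradient almost everywhere.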

tracycw commented 7 years ago

What about the rank-loss layer?

ParitoshParmar commented 7 years ago

@tracycw I think they provided the code for verification purposes, which is why it doesn't contain the ranking loss layer. But here is a good starting point for a ranking loss layer: https://github.com/wanji/caffe-sl
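
For reference, the ranking loss in the paper is a pairwise hinge between the softmax probabilities of the ground-truth class at consecutive scales, L_rank(p(s), p(s+1)) = max(0, p(s) − p(s+1) + margin), which forces each finer scale to be more confident than the one before it. A minimal NumPy sketch, assuming scalar true-class probabilities and an illustrative margin value:

```python
import numpy as np

def pairwise_rank_loss(p_coarse, p_fine, margin=0.05):
    """Hinge-style inter-scale ranking loss: zero only when the finer
    scale's probability for the true class beats the coarser scale's
    by at least `margin`, pushing each scale to improve on the last."""
    return np.maximum(0.0, p_coarse - p_fine + margin)

# example: the fine scale is only barely better, so a small loss remains
print(pairwise_rank_loss(0.60, 0.62))  # 0.03
```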