iamhankai / attribute-aware-attention

[ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning
https://arxiv.org/abs/1901.00392

emb_dim in code is 512, not 2048? #5

Closed · prologueyan-hr closed this issue 5 years ago

iamhankai commented 5 years ago

https://github.com/iamhankai/attribute-aware-attention/blob/ab3210705524bcb56f0121de0bcd47f3f93fb6c5/cub_demo.py#L39

512 is for VGG-16, 2048 is for ResNet-50.
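
For context, those two numbers are the channel widths of each backbone's final convolutional features, which the code must match. A minimal Python sketch (not the repo's actual code; the dict and helper name here are hypothetical):

```python
# Hypothetical helper (not from the repo): the final conv feature width
# depends on the chosen backbone, and downstream layers must match it.
BACKBONE_FEAT_DIM = {
    "vgg16": 512,      # VGG-16's last conv block outputs 512 channels
    "resnet50": 2048,  # ResNet-50's last stage outputs 2048 channels
}

def get_feat_dim(backbone: str) -> int:
    """Return the backbone's final convolutional feature dimension."""
    return BACKBONE_FEAT_DIM[backbone]
```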

prologueyan-hr commented 5 years ago

I mean the variable emb_dim, where emb_dim = 512.

iamhankai commented 5 years ago

> I mean the variable emb_dim, where emb_dim = 512.

Sorry, my fault. The length of the embedding is not important; the performance is similar either way.
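
In other words, emb_dim is just the width of a learned projection applied to the backbone features, and it is a free hyperparameter independent of the backbone's 512/2048 feature width. A hedged NumPy sketch of such a projection (illustrative only; all names here are mine, not the repo's):

```python
import numpy as np

# Illustrative sketch (not the authors' code): project backbone features
# of size feat_dim down to an embedding of size emb_dim. Per the author,
# the exact emb_dim matters little for accuracy.
rng = np.random.default_rng(0)

feat_dim = 2048  # e.g. ResNet-50 final conv features
emb_dim = 512    # embedding length; largely a tuning choice

W = rng.standard_normal((feat_dim, emb_dim)) * 0.01  # projection weights
features = rng.standard_normal((8, feat_dim))        # a batch of 8 feature vectors
embeddings = features @ W                            # shape (8, 512)
print(embeddings.shape)
```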