Closed educob closed 5 years ago
Hi.
```python
while idx + batch_size <= len(loader.dataset):
    batch, _ = iter(loader).next()
    if tta:
        fliped = hflip_batch(batch)
        emb_batch = backbone(batch.to(device)).cpu() + backbone(fliped.to(device)).cpu()
        features[idx:idx + batch_size] = l2_norm(emb_batch)
    else:
        features[idx:idx + batch_size] = backbone(batch.to(device)).cpu()  # line 75
    idx += batch_size
if idx < len(loader.dataset):
    batch, _ = iter(loader).next()
    if tta:
        fliped = hflip_batch(batch)
        emb_batch = backbone(batch.to(device)).cpu() + backbone(fliped.to(device)).cpu()
        features[idx:] = l2_norm(emb_batch)
    else:
        features[idx:] = l2_norm(backbone(batch.to(device)).cpu())
```
Line 75 seems to be missing the l2_norm call that the other three cases have.
I'd also appreciate an explanation of why l2_norm is necessary. If the aim is to search a database for who a person is, normalizing doesn't seem like a good idea.
Thanks.
Thank you. We have updated accordingly:)
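For anyone landing here with the same question: l2_norm just rescales each embedding to unit Euclidean length, so a database lookup by inner product compares the *angle* between embeddings rather than their magnitude; identity ranking under cosine similarity is unaffected. A minimal sketch in NumPy (the `l2_norm` here is a hypothetical re-implementation for illustration; the repo's version operates on torch tensors):

```python
import numpy as np

def l2_norm(x, axis=1):
    # Divide each row (one embedding) by its Euclidean length,
    # so every embedding ends up with unit norm.
    norm = np.linalg.norm(x, axis=axis, keepdims=True)
    return x / norm

# Two embeddings pointing in the same direction but with different magnitudes,
# e.g. the same face seen under different conditions:
a = np.array([[1.0, 2.0, 2.0]])   # norm 3
b = np.array([[2.0, 4.0, 4.0]])   # norm 6, same direction

na, nb = l2_norm(a), l2_norm(b)

# After normalization both embeddings have unit length, and their inner
# product is exactly their cosine similarity (1.0 here, since the
# directions coincide), so magnitude no longer influences the search.
print(np.linalg.norm(na), np.linalg.norm(nb), float(na @ nb.T))
```

With unnormalized embeddings, `a @ b.T` would be 18 for these two vectors even though they represent the same direction, which is why comparing raw inner products across a database mixes up similarity with embedding magnitude.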