Panxjia / SPA_CVPR2021

The official implementation of the SPA_CVPR2021 paper
MIT License

something about inference batchsize #4

Open chi0612 opened 2 years ago

chi0612 commented 2 years ago

Can the batch size only be set to 1 in the inference phase? An error is reported when it is set to any other value:

Traceback (most recent call last):
  File "val_spa.py", line 422, in <module>
    val(args)
  File "val_spa.py", line 246, in val
    prec1_1, prec5_1 = evaluate.accuracy(cls_logits.cpu().data, label_in.long(), topk=(1, 5))
  File "../utils/evaluate.py", line 19, in accuracy
    correct = pred.eq(target.view(1, -1).expand_as(pred))
RuntimeError: The expanded size of the tensor (1) must match the existing size (32) at non-singleton dimension 1.  Target sizes: [5, 1].  Tensor sizes: [1, 32]
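For reference, the shape mismatch in the traceback (pred of size [5, 1] against 32 labels) suggests `cls_logits` only carries a batch dimension of 1 while `label_in` holds 32 entries. The widely used top-k accuracy helper itself handles any batch size when the logits keep their full batch dimension; below is a minimal sketch of that standard pattern (not the repository's exact `evaluate.py`, and the tensor shapes are illustrative assumptions):

```python
import torch

def accuracy(output, target, topk=(1,)):
    """Standard top-k accuracy: output is [batch, num_classes], target is [batch]."""
    maxk = max(topk)
    batch_size = target.size(0)
    # Indices of the top-k predictions per sample: [batch, maxk] -> [maxk, batch]
    _, pred = output.topk(maxk, dim=1, largest=True, sorted=True)
    pred = pred.t()
    # Broadcast targets against predictions: both [maxk, batch]
    correct = pred.eq(target.view(1, -1).expand_as(pred))
    res = []
    for k in topk:
        correct_k = correct[:k].reshape(-1).float().sum(0)
        res.append(correct_k.mul_(100.0 / batch_size))
    return res

# Hypothetical shapes: a batch of 32 samples over 200 classes works fine here.
logits = torch.randn(32, 200)
labels = torch.randint(0, 200, (32,))
prec1, prec5 = accuracy(logits, labels, topk=(1, 5))
```

If `cls_logits` arrives with its batch dimension squeezed to 1 (as the maintainer's reply about HSM generation implies), the `expand_as` step fails exactly as shown in the traceback, regardless of how the helper is written.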

Panxjia commented 2 years ago

Thanks for your attention. This version only supports a batch size of 1 for inference, due to the generation of the HSM. We will release a version supporting batch sizes > 1 soon.