`cal_criterion` selects a different number of features for the training-free and fine-tuning settings. For fine-tuning it returns:

```python
_, indices = torch.topk(criterion, k=min(cfg['training_feat_num'], len(criterion)))
```

However, `APE_training` calls `self.indices = cal_criterion(cfg, clip_weights, cache_keys)`, where the argument `training_free` defaults to `True`. The wrong configuration is therefore used, eventually causing a dimension mismatch at

```python
new_cache_keys[:, self.indices] = new_cache_keys[:, self.indices] + res_keys
```

```
RuntimeError: The size of tensor a (800) must match the size of tensor b (900) at non-singleton dimension 1
```
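For reference, a minimal stand-in (plain Python, no torch; `cal_criterion_sketch` is a hypothetical helper, and the two feature counts are inferred from the error message, where the sliced tensor is 800 wide and `res_keys` is 900 wide) showing why the two code paths disagree:

```python
def cal_criterion_sketch(training_free: bool) -> list:
    """Hypothetical stand-in for cal_criterion, returning the selected
    feature indices. Counts are assumed from the reported RuntimeError:
    the slice new_cache_keys[:, indices] is 800 wide, res_keys is 900."""
    if training_free:
        # training-free path: taken by default, since training_free=True
        return list(range(800))
    # fine-tuning path: k = min(cfg['training_feat_num'], len(criterion))
    return list(range(900))

# APE_training leaves training_free at its default of True ...
indices = cal_criterion_sketch(training_free=True)

# ... but res_keys downstream is sized for the fine-tuning selection,
res_keys_width = len(cal_criterion_sketch(training_free=False))

# so 800 selected columns cannot receive a 900-wide tensor, which is
# exactly the size mismatch torch reports at dimension 1.
print(len(indices), res_keys_width)
```

Passing `training_free=False` explicitly in the `APE_training` call would make both paths use the same feature count.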