Hello, the issue is that, to run eval on a CCT+SeqVLAD model, you have to pass the same parameters that you would pass if you were training that architecture. In practice, the command has to be the following:
```
python main_scripts/evaluation.py --resume model_path --img_shape 384 384 --trunc_te 8 --freeze_te 1 --arch cct384 --aggregation seqvlad --dataset_path /path/msls_reformat --seq_length 5
```
Again, I plan to make all of these arguments automatic in my next commit.
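Until then, if you want to avoid re-typing the flags, one way to automate this yourself is to save the argparse namespace inside the checkpoint and restore it before building the model at eval time. This is only a minimal sketch, assuming a checkpoint dict with hypothetical `"state_dict"` and `"args"` keys and a hypothetical `build_model` factory; it is not the repository's actual save format:

```python
import torch

def save_checkpoint(model, args, path):
    # Persist the weights together with the training arguments
    # (arch, aggregation, img_shape, trunc_te, freeze_te, seq_length, ...).
    torch.save({"state_dict": model.state_dict(), "args": vars(args)}, path)

def load_for_eval(build_model, path):
    # build_model is a hypothetical factory that accepts the same
    # keyword arguments the training script parses from the CLI.
    ckpt = torch.load(path, map_location="cpu")
    model = build_model(**ckpt["args"])
    model.load_state_dict(ckpt["state_dict"])
    return model
```

With something like this in place, evaluation only needs `--resume model_path` and rebuilds the exact architecture the checkpoint was trained with.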
When I run the 'Evaluate trained models' step, the following error occurs:

```
RuntimeError: Error(s) in loading state_dict for TVGNet:
    Missing key(s) in state_dict: "encoder.0.weight", "encoder.1.weight", "encoder.1.bias", "encoder.1.running_mean", "encoder.1.running_var", "encoder.4.0.conv1.weight", "encoder.4.0.bn1.weight", "encoder.4.0.bn1.bias", "encoder.4.0.bn1.running_mean", "encoder.4.0.bn1.running_var", "encoder.4.0.conv2.weight", "encoder.4.0.bn2.weight", "encoder.4.0.bn2.bias", "encoder.4.0.bn2.running_mean", "encoder.4.0.bn2.running_var", "encoder.4.1.conv1.weight", "encoder.4.1.bn1.weight", "encoder.4.1.bn1.bias", "encoder.4.1.bn1.running_mean", "encoder.4.1.bn1.running_var", "encoder.4.1.conv2.weight", "encoder.4.1.bn2.weight", "encoder.4.1.bn2.bias", "encoder.4.1.bn2.running_mean", "encoder.4.1.bn2.running_var", "encoder.5.0.conv1.weight", "encoder.5.0.bn1.weight", "encoder.5.0.bn1.bias", "encoder.5.0.bn1.running_mean", "encoder.5.0.bn1.running_var", "encoder.5.0.conv2.weight", "encoder.5.0.bn2.weight", "encoder.5.0.bn2.bias", "encoder.5.0.bn2.running_mean", "encoder.5.0.bn2.running_var", "encoder.5.0.downsample.0.weight", "encoder.5.0.downsample.1.weight", "encoder.5.0.downsample.1.bias", "encoder.5.0.downsample.1.running_mean", "encoder.5.0.downsample.1.running_var", "encoder.5.1.conv1.weight", "encoder.5.1.bn1.weight", "encoder.5.1.bn1.bias", "encoder.5.1.bn1.running_mean", "encoder.5.1.bn1.running_var", "encoder.5.1.conv2.weight", "encoder.5.1.bn2.weight", "encoder.5.1.bn2.bias", "encoder.5.1.bn2.running_mean", "encoder.5.1.bn2.running_var", "encoder.6.0.conv1.weight", "encoder.6.0.bn1.weight", "encoder.6.0.bn1.bias", "encoder.6.0.bn1.running_mean", "encoder.6.0.bn1.running_var", "encoder.6.0.conv2.weight", "encoder.6.0.bn2.weight", "encoder.6.0.bn2.bias", "encoder.6.0.bn2.running_mean", "encoder.6.0.bn2.running_var", "encoder.6.0.downsample.0.weight", "encoder.6.0.downsample.1.weight", "encoder.6.0.downsample.1.bias", "encoder.6.0.downsample.1.running_mean", "encoder.6.0.downsample.1.running_var", "encoder.6.1.conv1.weight", "encoder.6.1.bn1.weight", "encoder.6.1.bn1.bias", "encoder.6.1.bn1.running_mean", "encoder.6.1.bn1.running_var", "encoder.6.1.conv2.weight", "encoder.6.1.bn2.weight", "encoder.6.1.bn2.bias", "encoder.6.1.bn2.running_mean", "encoder.6.1.bn2.running_var".
    Unexpected key(s) in state_dict: "encoder.tokenizer.conv_layers.0.0.weight", "encoder.tokenizer.conv_layers.1.0.weight", "encoder.classifier.positional_emb", "encoder.classifier.attention_pool.weight", "encoder.classifier.attention_pool.bias", "encoder.classifier.blocks.0.pre_norm.weight", "encoder.classifier.blocks.0.pre_norm.bias", "encoder.classifier.blocks.0.self_attn.qkv.weight", "encoder.classifier.blocks.0.self_attn.proj.weight", "encoder.classifier.blocks.0.self_attn.proj.bias", "encoder.classifier.blocks.0.linear1.weight", "encoder.classifier.blocks.0.linear1.bias", "encoder.classifier.blocks.0.norm1.weight", "encoder.classifier.blocks.0.norm1.bias", "encoder.classifier.blocks.0.linear2.weight", "encoder.classifier.blocks.0.linear2.bias", "encoder.classifier.blocks.1.pre_norm.weight", "encoder.classifier.blocks.1.pre_norm.bias", "encoder.classifier.blocks.1.self_attn.qkv.weight", "encoder.classifier.blocks.1.self_attn.proj.weight", "encoder.classifier.blocks.1.self_attn.proj.bias", "encoder.classifier.blocks.1.linear1.weight", "encoder.classifier.blocks.1.linear1.bias", "encoder.classifier.blocks.1.norm1.weight", "encoder.classifier.blocks.1.norm1.bias", "encoder.classifier.blocks.1.linear2.weight", "encoder.classifier.blocks.1.linear2.bias", "encoder.classifier.blocks.2.pre_norm.weight", "encoder.classifier.blocks.2.pre_norm.bias", "encoder.classifier.blocks.2.self_attn.qkv.weight", "encoder.classifier.blocks.2.self_attn.proj.weight", "encoder.classifier.blocks.2.self_attn.proj.bias", "encoder.classifier.blocks.2.linear1.weight", "encoder.classifier.blocks.2.linear1.bias", "encoder.classifier.blocks.2.norm1.weight", "encoder.classifier.blocks.2.norm1.bias", "encoder.classifier.blocks.2.linear2.weight", "encoder.classifier.blocks.2.linear2.bias", "encoder.classifier.blocks.3.pre_norm.weight", "encoder.classifier.blocks.3.pre_norm.bias", "encoder.classifier.blocks.3.self_attn.qkv.weight", "encoder.classifier.blocks.3.self_attn.proj.weight", "encoder.classifier.blocks.3.self_attn.proj.bias", "encoder.classifier.blocks.3.linear1.weight", "encoder.classifier.blocks.3.linear1.bias", "encoder.classifier.blocks.3.norm1.weight", "encoder.classifier.blocks.3.norm1.bias", "encoder.classifier.blocks.3.linear2.weight", "encoder.classifier.blocks.3.linear2.bias", "encoder.classifier.blocks.4.pre_norm.weight", "encoder.classifier.blocks.4.pre_norm.bias", "encoder.classifier.blocks.4.self_attn.qkv.weight", "encoder.classifier.blocks.4.self_attn.proj.weight", "encoder.classifier.blocks.4.self_attn.proj.bias", "encoder.classifier.blocks.4.linear1.weight", "encoder.classifier.blocks.4.linear1.bias", "encoder.classifier.blocks.4.norm1.weight", "encoder.classifier.blocks.4.norm1.bias", "encoder.classifier.blocks.4.linear2.weight", "encoder.classifier.blocks.4.linear2.bias", "encoder.classifier.blocks.5.pre_norm.weight", "encoder.classifier.blocks.5.pre_norm.bias", "encoder.classifier.blocks.5.self_attn.qkv.weight", "encoder.classifier.blocks.5.self_attn.proj.weight", "encoder.classifier.blocks.5.self_attn.proj.bias", "encoder.classifier.blocks.5.linear1.weight", "encoder.classifier.blocks.5.linear1.bias", "encoder.classifier.blocks.5.norm1.weight", "encoder.classifier.blocks.5.norm1.bias", "encoder.classifier.blocks.5.linear2.weight", "encoder.classifier.blocks.5.linear2.bias", "encoder.classifier.blocks.6.pre_norm.weight", "encoder.classifier.blocks.6.pre_norm.bias", "encoder.classifier.blocks.6.self_attn.qkv.weight", "encoder.classifier.blocks.6.self_attn.proj.weight", "encoder.classifier.blocks.6.self_attn.proj.bias", "encoder.classifier.blocks.6.linear1.weight", "encoder.classifier.blocks.6.linear1.bias", "encoder.classifier.blocks.6.norm1.weight", "encoder.classifier.blocks.6.norm1.bias", "encoder.classifier.blocks.6.linear2.weight", "encoder.classifier.blocks.6.linear2.bias", "encoder.classifier.blocks.7.pre_norm.weight", "encoder.classifier.blocks.7.pre_norm.bias", "encoder.classifier.blocks.7.self_attn.qkv.weight", "encoder.classifier.blocks.7.self_attn.proj.weight", "encoder.classifier.blocks.7.self_attn.proj.bias", "encoder.classifier.blocks.7.linear1.weight", "encoder.classifier.blocks.7.linear1.bias", "encoder.classifier.blocks.7.norm1.weight", "encoder.classifier.blocks.7.norm1.bias", "encoder.classifier.blocks.7.linear2.weight", "encoder.classifier.blocks.7.linear2.bias", "encoder.classifier.norm.weight", "encoder.classifier.norm.bias".
    size mismatch for aggregator.centroids: copying a param with shape torch.Size([64, 384]) from checkpoint, the shape in current model is torch.Size([64, 256]).
    size mismatch for aggregator.conv.weight: copying a param with shape torch.Size([64, 384, 1]) from checkpoint, the shape in current model is torch.Size([64, 256, 1, 1]).
```

I did not make any changes to the code, and I am currently unable to fix this error. Sorry to bother you; hope to hear from you soon!