I was training and evaluating the model when I noticed something weird. On line 223 in train_CMC.py, the resnet model outputs two feature tensors, each of size (batch size, 128) as expected. However, once I save that resnet model and load it again in LinearProbing.py, the feature tensors (line 233 in LinearProbing.py) suddenly have size (batch size, 1024, 7, 7). I'm sure the models and checkpoints are consistent, since I never changed them. Is there something I am missing here? Thanks!
Let me answer my own question. The difference is the `layer` parameter passed to the model in train_CMC.py versus LinearProbing.py: the former uses all 7 layers (so the features come out of the final 128-d head), while the latter only uses 5 layers, so it returns the (batch size, 1024, 7, 7) feature map from an intermediate conv block. Oops
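For anyone hitting the same thing, here is a minimal sketch (a plain NumPy stand-in with dummy zero tensors, not the repo's actual code, and the intermediate shapes are assumptions) of how a single `layer` argument in the forward pass can change the returned feature shape even though the checkpoint is identical:

```python
import numpy as np

def forward(x, layer=7):
    """Toy forward pass: return shape depends on `layer`, mirroring how
    train_CMC.py (layer=7) and LinearProbing.py (layer=5) can get
    differently shaped features from the same weights."""
    batch = x.shape[0]
    # Conv blocks would run here; assume that after block 5 the feature
    # map is (batch, 1024, 7, 7) for a 224x224 input.
    feat_map = np.zeros((batch, 1024, 7, 7))
    if layer == 5:
        return feat_map  # intermediate conv features, no pooling/head
    # Layers 6-7: global average pooling + projection head down to 128-d.
    pooled = feat_map.mean(axis=(2, 3))   # (batch, 1024)
    proj = np.zeros((1024, 128))          # stand-in for the fc weights
    return pooled @ proj                  # (batch, 128)

x = np.zeros((8, 3, 224, 224))
print(forward(x, layer=7).shape)  # what train_CMC.py sees
print(forward(x, layer=5).shape)  # what LinearProbing.py sees
```

The fix is simply to pass the same `layer` value in both scripts (or to expect the intermediate feature-map shape when probing an earlier layer).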