Open LaurentGarcia opened 3 days ago
If you're trying to use pretrained weights, ensure that the architecture of your model exactly matches the one used to train the checkpoint. This might involve using an older or different version of the model architecture, or using a checkpoint trained for the specific version of your model. :)
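Before changing architectures, it can help to see exactly how the checkpoint and the model disagree. The sketch below (a generic diagnostic, not MonoHair-specific; the tiny `nn.Sequential` model and the checkpoint dict are illustrative stand-ins) diffs the checkpoint keys against `model.state_dict()` so you can tell unexpected keys apart from shape mismatches:

```python
# Hedged sketch: diff a checkpoint against a model before load_state_dict,
# so the failure mode (unexpected keys vs. shape mismatches) is explicit.
import torch
import torch.nn as nn

def diff_state_dicts(model, state_dict):
    """Return (unexpected, missing, shape_mismatched) key lists."""
    model_sd = model.state_dict()
    unexpected = [k for k in state_dict if k not in model_sd]
    missing = [k for k in model_sd if k not in state_dict]
    mismatched = [
        k for k in state_dict
        if k in model_sd and state_dict[k].shape != model_sd[k].shape
    ]
    return unexpected, missing, mismatched

# Tiny stand-in model for demonstration only (not the MonoHair ResNet).
model = nn.Sequential(nn.Conv2d(3, 8, 3))
ckpt = {
    "0.weight": torch.zeros(8, 3, 1, 1),  # 1x1 kernel vs. model's 3x3 -> mismatch
    "0.bias": torch.zeros(8),             # matches
    "extra.weight": torch.zeros(4),       # unexpected key
}

unexpected, missing, mismatched = diff_state_dicts(model, ckpt)
print(unexpected)   # ['extra.weight']
print(mismatched)   # ['0.weight']
```

If the diff shows many `layer5`/`layer6` keys as unexpected plus 1x1-vs-3x3 shape mismatches (as in the traceback below), that points to a different architecture revision rather than a corrupted download.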
Not sure if you can enlighten us 👍
@KeyuWu-CS I really need help here 💯
There are several suggestions:
I am getting the following error; I'm not sure whether it's caused by a resolution mismatch between the different assets.
May I ask if you can share your \MonoHair\assets\data folder? It would be great to have a list of requirements for that folder and also the recommended resolutions :)
Start calculating hair masks!
Traceback (most recent call last):
File "C:\Users\Lauren\Documents\Source\MonoHair\prepare_data.py", line 182, in
calculate_mask(segment_args)
File "C:\Users\Lauren\Documents\Source\MonoHair\preprocess_capture_data\calc_masks.py", line 180, in calculate_mask
model.load_state_dict(state_dict)
File "C:\Users\Lauren\miniconda3\envs\ML\lib\site-packages\torch\nn\modules\module.py", line 2041, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for ResNet:
Unexpected key(s) in state_dict: "layer5.stages.0.2.weight", "layer5.stages.0.2.bias", "layer5.stages.0.2.running_mean", "layer5.stages.0.2.running_var", "layer5.stages.1.2.weight", "layer5.stages.1.2.bias", "layer5.stages.1.2.running_mean", "layer5.stages.1.2.running_var", "layer5.stages.2.2.weight", "layer5.stages.2.2.bias", "layer5.stages.2.2.running_mean", "layer5.stages.2.2.running_var", "layer5.stages.3.2.weight", "layer5.stages.3.2.bias", "layer5.stages.3.2.running_mean", "layer5.stages.3.2.running_var", "layer5.bottleneck.1.weight", "layer5.bottleneck.1.bias", "layer5.bottleneck.1.running_mean", "layer5.bottleneck.1.running_var", "edge_layer.conv1.1.weight", "edge_layer.conv1.1.bias", "edge_layer.conv1.1.running_mean", "edge_layer.conv1.1.running_var", "edge_layer.conv2.1.weight", "edge_layer.conv2.1.bias", "edge_layer.conv2.1.running_mean", "edge_layer.conv2.1.running_var", "edge_layer.conv3.1.weight", "edge_layer.conv3.1.bias", "edge_layer.conv3.1.running_mean", "edge_layer.conv3.1.running_var", "layer6.conv1.1.weight", "layer6.conv1.1.bias", "layer6.conv1.1.running_mean", "layer6.conv1.1.running_var", "layer6.conv2.1.weight", "layer6.conv2.1.bias", "layer6.conv2.1.running_mean", "layer6.conv2.1.running_var", "layer6.conv3.1.weight", "layer6.conv3.1.bias", "layer6.conv3.1.running_mean", "layer6.conv3.1.running_var", "layer6.conv3.3.weight", "layer6.conv3.3.bias", "layer6.conv3.3.running_mean", "layer6.conv3.3.running_var", "layer7.1.weight", "layer7.1.bias", "layer7.1.running_mean", "layer7.1.running_var".
size mismatch for layer6.conv2.0.weight: copying a param with shape torch.Size([48, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([48, 256, 3, 3]).
size mismatch for layer6.conv3.0.weight: copying a param with shape torch.Size([256, 304, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 304, 3, 3]).
size mismatch for layer7.0.weight: copying a param with shape torch.Size([256, 1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1024, 3, 3]).
(ML) C:\Users\Lauren\Documents\Source\MonoHair>
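In case it's useful to others hitting the same RuntimeError: one common workaround is to filter the checkpoint down to shape-compatible entries and load with `strict=False`. This is only a sketch under an assumption — the dropped layers (here the mismatched `layer5`/`layer6`/`layer7` weights) stay randomly initialized, so it is only safe if those layers are actually unused at inference; the real fix is matching the architecture version. The tiny model and dict below are illustrative stand-ins, not the MonoHair ResNet.

```python
# Sketch: keep only checkpoint entries whose shapes match the model,
# then load non-strictly. Avoids the RuntimeError, but any dropped
# parameter remains at its random init (assumption: those layers are unused).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 8, 3))     # stand-in for the real model
ckpt = {
    "0.weight": torch.zeros(8, 3, 1, 1),      # wrong shape -> filtered out
    "0.bias": torch.zeros(8),                 # matches -> loaded
}

model_sd = model.state_dict()
filtered = {k: v for k, v in ckpt.items()
            if k in model_sd and v.shape == model_sd[k].shape}
result = model.load_state_dict(filtered, strict=False)
print(result.missing_keys)  # ['0.weight'] — left at random init
```

`load_state_dict` returns an `_IncompatibleKeys` named tuple, so `result.missing_keys` tells you exactly which parameters were not restored.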