MhLiao / MaskTextSpotterV3

The code of "Mask TextSpotter v3: Segmentation Proposal Network for Robust Scene Text Spotting"

Error in loading state dict #53

Open kriti-agrawal-52 opened 3 years ago

kriti-agrawal-52 commented 3 years ago

Hello, I am training on a custom dataset with 80 classes. I have updated `CHAR_NUM_CLASSES` and `NUM_CHAR` in the config file and set `RESUME` and `CHAR_MASK_ON` to `False`. I am using the pre-trained model for fine-tuning. I have also updated the `char_to_num` and `num_to_char` functions, added a dataset class for the new dataset in the maskrcnn_benchmark folder, and updated `paths_catalog.py`. I am getting the following error:

```
RuntimeError: Error(s) in loading state_dict for DistributedDataParallel:
    size mismatch for module.roi_heads.mask.predictor.seq.seq_decoder.embedding.weight: copying a param with shape torch.Size([38, 38]) from checkpoint, the shape in current model is torch.Size([83, 83]).
    size mismatch for module.roi_heads.mask.predictor.seq.seq_decoder.word_linear.weight: copying a param with shape torch.Size([256, 38]) from checkpoint, the shape in current model is torch.Size([256, 83]).
    size mismatch for module.roi_heads.mask.predictor.seq.seq_decoder.out.weight: copying a param with shape torch.Size([38, 256]) from checkpoint, the shape in current model is torch.Size([83, 256]).
    size mismatch for module.roi_heads.mask.predictor.seq.seq_decoder.out.bias: copying a param with shape torch.Size([38]) from checkpoint, the shape in current model is torch.Size([83]).
```

Please tell me how I can fix this error. Thank you
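The error occurs because the released checkpoint was trained with 38 character classes, so the decoder's embedding and output layers have different shapes than a model built for 83 classes. A common workaround (not an official fix from this repo, just a minimal PyTorch sketch with a hypothetical `TinyDecoder` standing in for the real `seq_decoder`) is to drop the mismatched parameters from the checkpoint before loading, so only the shape-compatible weights are transferred and the rest are trained from scratch:

```python
import torch
import torch.nn as nn

# Hypothetical toy module standing in for MaskTextSpotterV3's seq_decoder:
# the embedding/output shapes depend on the number of character classes,
# while `proj` is class-count independent (like the decoder's hidden layers).
class TinyDecoder(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(num_classes, num_classes)
        self.proj = nn.Linear(256, 256)
        self.out = nn.Linear(256, num_classes)

# Simulate a checkpoint trained with 38 classes and a new model with 83.
ckpt_state = TinyDecoder(38).state_dict()
model = TinyDecoder(83)

# Keep only parameters whose shapes match the current model, then load the
# merged state dict; the mismatched layers keep their fresh initialization.
model_state = model.state_dict()
filtered = {k: v for k, v in ckpt_state.items()
            if k in model_state and v.shape == model_state[k].shape}
model_state.update(filtered)
model.load_state_dict(model_state)

# Class-count-dependent weights are dropped; shared weights are transferred.
print(sorted(filtered))
```

In practice you would apply the same filtering to the checkpoint dict before the trainer calls `load_state_dict` (note the checkpoint keys may carry a `module.` prefix under `DistributedDataParallel`). Setting `strict=False` alone does not help here, since it only ignores missing/unexpected keys, not shape mismatches.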