size mismatch for final_logits_bias: copying a param with shape torch.Size([1, 54953]) from checkpoint, the shape in current model is torch.Size([1, 54944]).
size mismatch for model.shared.weight: copying a param with shape torch.Size([54953, 512]) from checkpoint, the shape in current model is torch.Size([54944, 512]).
size mismatch for model.encoder.embed_tokens.weight: copying a param with shape torch.Size([54953, 512]) from checkpoint, the shape in current model is torch.Size([54944, 512]).
size mismatch for model.decoder.embed_tokens.weight: copying a param with shape torch.Size([54953, 512]) from checkpoint, the shape in current model is torch.Size([54944, 512]).
size mismatch for lm_head.weight: copying a param with shape torch.Size([54953, 512]) from checkpoint, the shape in current model is torch.Size([54944, 512]).
After downloading the model from https://huggingface.co/lsy641/ESC_Blender_noStrategy/tree/main into the blend-small folder and running `python BlenderEmotionalSupport.py`, I hit the dimension-mismatch error above. What could be causing this?
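For context, the checkpoint's vocabulary is 9 tokens larger than the freshly built model's (54953 vs 54944), which typically happens when the checkpoint was trained with extra tokens added to the tokenizer. The sketch below (plain PyTorch, sizes taken from the error; the `resize_token_embeddings` call mentioned in the comment is the usual Hugging Face remedy, not something confirmed by this repo) reproduces the mismatch and shows that building the embedding at the checkpoint's size makes loading succeed:

```python
import torch
import torch.nn as nn

# Sizes from the error message: checkpoint vocab vs current model vocab.
ckpt_vocab, model_vocab, dim = 54953, 54944, 512

saved = nn.Embedding(ckpt_vocab, dim)   # stands in for the checkpoint weights
model = nn.Embedding(model_vocab, dim)  # stands in for the freshly built model

# Reproduces the "size mismatch" RuntimeError from the question.
try:
    model.load_state_dict(saved.state_dict())
except RuntimeError as e:
    print("reproduced:", "size mismatch" in str(e))

# With Hugging Face models, the usual fix is to grow the embeddings to match
# the checkpoint's tokenizer before loading, roughly:
#     model.resize_token_embeddings(len(tokenizer))
# Here we emulate that by constructing the embedding at the checkpoint size.
model = nn.Embedding(ckpt_vocab, dim)
model.load_state_dict(saved.state_dict())  # now succeeds
```

If the repo ships its own tokenizer files, loading that tokenizer (rather than the base blenderbot one) before building the model may avoid the mismatch entirely.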