Hi there,
Could I know how to fine-tune on a new dataset with a larger word dictionary?
The current model has a 3129-entry word/answer dictionary, while ours expands it to 50085 entries (the first 3129 indices in our dictionary are identical to the original dictionary).
I currently get this error:
size mismatch for classifier.weight: copying a param with shape torch.Size([3129, 1024]) from checkpoint, the shape in current model is torch.Size([50085, 1024]).
size mismatch for classifier.bias: copying a param with shape torch.Size([3129]) from checkpoint, the shape in current model is torch.Size([50085]).
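Is the right approach something like the sketch below? This is only my guess: it assumes the checkpoint is a plain state_dict, that the output head is the `classifier` Linear layer named in the error, and `build_model` is just a placeholder for however the 50085-way model is actually constructed in this repo. The idea is to load everything except the classifier with `strict=False`, then copy the pretrained 3129 rows into the first slots of the enlarged classifier.

```python
import torch

# Hypothetical path and builder, just for illustration -- adjust to the
# actual checkpoint file and model constructor used in this repo.
ckpt = torch.load("model.pth", map_location="cpu")
state_dict = ckpt.get("model_state_dict", ckpt)  # unwrap if the checkpoint is nested

model = build_model(num_answers=50085)  # model with the enlarged 50085-way classifier

# Pull the old classifier parameters out so load_state_dict doesn't see the size mismatch.
old_w = state_dict.pop("classifier.weight")  # shape [3129, 1024]
old_b = state_dict.pop("classifier.bias")    # shape [3129]

# Load every other pretrained parameter; the classifier keys will show up as missing.
missing, unexpected = model.load_state_dict(state_dict, strict=False)

# Copy the pretrained rows into the first 3129 slots of the new classifier;
# indices 3129..50084 keep their fresh initialization.
with torch.no_grad():
    model.classifier.weight[: old_w.size(0)].copy_(old_w)
    model.classifier.bias[: old_b.size(0)].copy_(old_b)
```

Would this preserve the pretrained answer weights correctly, or is there a recommended way to do this in the repo?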
Thanks in advance!