marcovzla / axtk


[#1] Handle mismatch between model.config.vocab_size and tokenizer.vocab_size #2

Closed: limjiayi closed this 11 months ago

limjiayi commented 1 year ago

It's also possible to avoid the error described in #1 by resizing the model's token embeddings. Feel free to close this PR if that's the preferred solution.

```python
model.resize_token_embeddings(tokenizer.vocab_size)
```
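A minimal sketch of how such a resize might look end to end, assuming a standard Hugging Face `transformers` model and tokenizer (the checkpoint name below is just an example, not from this repo). Note that `len(tokenizer)` counts tokens added after loading, while `tokenizer.vocab_size` does not, so `len(tokenizer)` is often the safer target:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example checkpoint; substitute whichever model/tokenizer pair shows the mismatch.
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Adding tokens after loading is a common source of the size mismatch.
tokenizer.add_special_tokens({"pad_token": "<pad>"})

# len(tokenizer) includes added tokens; tokenizer.vocab_size only counts the base vocab.
if model.config.vocab_size != len(tokenizer):
    model.resize_token_embeddings(len(tokenizer))
```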
marcovzla commented 11 months ago

Thank you!