I want to add new tokens to the vocabulary when fine-tuning the pre-trained ALBERT model. For example, in BERT we can insert custom tokens into the vocabulary in place of the [unusedN] placeholder tokens:
[PAD]
[unused0]
[unused1]
[unused2]
[unused3]
[unused4]
...
But in ALBERT I could not find such an option, since its vocabulary (30k-clean.vocab) is completely filled and has no unused placeholder slots.
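To make the BERT trick concrete, here is a minimal sketch of what swapping custom tokens into the [unusedN] slots of a vocab.txt amounts to. The helper name and the example tokens are made up for illustration; the point is that the file keeps its size and every other token keeps its id:

```python
# Sketch of the BERT-style trick: vocab.txt ships with [unusedN]
# placeholder rows, so custom tokens can be swapped in without
# changing the vocab size or shifting any other token's id.
# (helper name and example tokens are hypothetical)

def replace_unused_tokens(vocab_lines, new_tokens):
    """Replace [unusedN] entries with custom tokens, in order."""
    pending = iter(new_tokens)
    out = []
    for line in vocab_lines:
        tok = line.strip()
        if tok.startswith("[unused"):
            # keep the placeholder if we run out of new tokens
            tok = next(pending, tok)
        out.append(tok)
    return out

vocab = ["[PAD]", "[unused0]", "[unused1]", "[unused2]", "the", "dog"]
print(replace_unused_tokens(vocab, ["[COVID]", "[GENE]"]))
# ['[PAD]', '[COVID]', '[GENE]', '[unused2]', 'the', 'dog']
```

ALBERT's 30k-clean.vocab has no such placeholder rows, which is exactly why this in-place substitution does not carry over.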