Adding special tokens via `tokenizer.add_special_tokens({"additional_special_tokens": added_tokens})` appends the new tokens to the end of the vocabulary. For some tokenizers, the corresponding ids were not updated, which resulted in index errors when the tokens were used.
This PR correctly updates the ids of the added special tokens for all implemented tokenizers.
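The intended behavior can be illustrated with a minimal toy sketch (the `ToyTokenizer` class here is purely illustrative, not the transformers implementation): tokens appended to the end of the vocabulary must receive fresh ids taken from the updated vocab, so lookups never fall outside the embedding range.

```python
# Illustrative sketch only -- not the actual transformers code.
# It shows the intended invariant: added special tokens are appended
# to the end of the vocab and their recorded ids stay in sync with it.

class ToyTokenizer:
    def __init__(self, vocab):
        self.vocab = dict(vocab)                 # token -> id
        self.additional_special_tokens = []
        self.additional_special_tokens_ids = []  # must mirror self.vocab

    def add_special_tokens(self, special_tokens_dict):
        added = 0
        for token in special_tokens_dict.get("additional_special_tokens", []):
            if token not in self.vocab:
                # Append to the end of the vocabulary with the next free id.
                self.vocab[token] = len(self.vocab)
                added += 1
            self.additional_special_tokens.append(token)
            # The fix: take the id from the *updated* vocab, so it is
            # always a valid index.
            self.additional_special_tokens_ids.append(self.vocab[token])
        return added

tok = ToyTokenizer({"hello": 0, "world": 1})
tok.add_special_tokens({"additional_special_tokens": ["<spec1>", "<spec2>"]})
print(tok.additional_special_tokens_ids)  # ids appended at the end: [2, 3]
```

Before the fix, some tokenizers recorded stale ids at this step, producing indices past the end of the vocabulary.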