shivanraptor closed this issue 9 months ago.
I had to downgrade to tokenizers 0.13.3 and transformers 4.28.0 in order to make the above code work.
That's not expected; transformers supports tokenizers<=0.15, see here. Make sure to check the packages in your running environment. If you can send a snippet of the custom tokenizer class, I can try to reproduce this if needed.
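To check the packages in the running environment, one option is to query the installed distribution metadata from Python itself (a small sketch; the package names are as published on PyPI):

```python
# Print the versions of tokenizers and transformers that are actually
# installed in the current environment, or flag them as missing.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("tokenizers", "transformers"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```

This avoids confusion between what `pip` installed elsewhere and what the interpreter running the code can actually import.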
Maybe it was a caching issue. The latest versions work fine now. Sorry for the confusion.
Here is my code to wrap my custom tokenizer with a PreTrainedTokenizerFast:

It results in an error:
I believe it's due to my recent update to tokenizers 0.14.1 and transformers 4.34.0, which do not support this version. I tried to downgrade tokenizers to 0.13.3, but transformers 4.34.0 does not support it. I guess I have to wait for the next update of transformers.
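The original snippet was not preserved above, so here is a minimal, hypothetical sketch of wrapping a custom `tokenizers.Tokenizer` in `PreTrainedTokenizerFast`; the WordLevel model, vocabulary, and special tokens are illustrative assumptions, not the original code:

```python
# A hypothetical custom tokenizer: a tiny WordLevel model with an
# illustrative vocabulary, split on whitespace.
from tokenizers import Tokenizer
from tokenizers.models import WordLevel
from tokenizers.pre_tokenizers import Whitespace
from transformers import PreTrainedTokenizerFast

vocab = {"[UNK]": 0, "[PAD]": 1, "hello": 2, "world": 3}
tokenizer = Tokenizer(WordLevel(vocab, unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

# Wrap the tokenizers.Tokenizer in the transformers fast-tokenizer API
# via the tokenizer_object argument.
fast_tokenizer = PreTrainedTokenizerFast(
    tokenizer_object=tokenizer,
    unk_token="[UNK]",
    pad_token="[PAD]",
)

print(fast_tokenizer("hello world")["input_ids"])  # [2, 3]
```

With matching tokenizers/transformers versions this runs cleanly; a version mismatch between the two packages, as described above, is one way a wrapper like this can fail at construction time.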