Open sinjohr opened 2 years ago
I want to use a custom tokenizer and encoder trained with the huggingface tokenizers library.
After training the huggingface tokenizer, I got a JSON file containing the vocabulary.
However, I don't know how to feed this custom tokenizer into train_finetune.py.
Could you give some guidance on how to set up and use a custom tokenizer?
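For reference, here is roughly how the tokenizer was trained and saved (a minimal sketch; the corpus, vocabulary size, and filename are placeholders, and this does not touch train_finetune.py itself):

```python
# Sketch: train a small BPE tokenizer with the huggingface `tokenizers`
# library, save it as a single JSON file, and reload it.
from tokenizers import Tokenizer, models, trainers, pre_tokenizers

# Build and train a tiny BPE tokenizer (the corpus here is a stand-in).
tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()
trainer = trainers.BpeTrainer(vocab_size=100,
                              special_tokens=["[UNK]", "[PAD]"])
corpus = ["hello world", "hello tokenizer", "custom tokenizer training"]
tokenizer.train_from_iterator(corpus, trainer=trainer)

# Saving produces the single JSON file (vocab + merges + config).
tokenizer.save("my_tokenizer.json")

# Reload it; this is the object the training script would need to use
# in place of its default tokenizer.
reloaded = Tokenizer.from_file("my_tokenizer.json")
ids = reloaded.encode("hello world").ids
print(ids)  # same ids as the in-memory tokenizer produces
```

The open question is where in train_finetune.py to swap this reloaded tokenizer in for the default one.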
I have the same problem. Please reply if you solve it. Thank you.