I want to do classification on Chinese texts. Although a Chinese BERT pre-trained model is available by default, it is quite outdated. I would like to use other pre-trained models, such as whole-word-masking (wwm) BERT, XLNet, and RoBERTa, which were trained on Chinese corpora and have already been converted to the pytorch-transformers format. Can I use them with simpletransformers? Thanks!
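For context, here is a sketch of what I am hoping is possible. I am assuming that `ClassificationModel` accepts a checkpoint name or local path as its second argument; the checkpoint name `hfl/chinese-bert-wwm` is just an example and not something I have tested:

```python
from simpletransformers.classification import ClassificationModel

# Hypothetical usage: point simpletransformers at a community
# Chinese checkpoint instead of the default bert-base-chinese.
# "hfl/chinese-bert-wwm" is an assumed model name; a local
# directory with converted pytorch-transformers weights should
# work the same way if paths are supported.
model = ClassificationModel(
    "bert",                  # model type
    "hfl/chinese-bert-wwm",  # assumed checkpoint name or local path
    num_labels=2,
    use_cuda=False,
)

# model.train_model(train_df) would then fine-tune on a labeled
# DataFrame, if this kind of custom checkpoint is supported.
```

If something like this already works, a pointer to the relevant docs would be much appreciated.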