Hi, I have an ELECTRA pre-trained model. It's not the official ELECTRA base/small/large; it's a Chinese pre-trained model from https://github.com/ymcui/Chinese-ELECTRA. What I want to do now is continue pre-training from this pre-trained model, but I can't find how to set an initial checkpoint for pre-training. For now, the command I am using is: python3 run_pretraining.py --data-dir pretrain_chinese_model --model-name chinese_model.
I put the checkpoint files of this Chinese model under "./pretrain_chinese_model".
The pre-training seems to have started successfully, and I got a chinese_model directory under "./pretrain_chinese_model/models/".
I was wondering, am I doing this right? I don't want it to train from scratch, but I can't find an obvious "init-checkpoint" flag in ELECTRA. Thanks for your help, really appreciate it.
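For clarity, here is a sketch of my setup. The checkpoint filenames below are placeholders (the real Chinese-ELECTRA release uses its own names), and I'm not sure the data dir is where run_pretraining.py actually looks for an initial checkpoint:

```shell
# Placeholder layout of what I did (filenames are hypothetical, not
# the actual Chinese-ELECTRA release names):
mkdir -p pretrain_chinese_model/models
touch pretrain_chinese_model/chinese_electra.ckpt.index
touch pretrain_chinese_model/chinese_electra.ckpt.data-00000-of-00001

# Then I ran ELECTRA's pre-training script; it wrote its output
# under <data-dir>/models/<model-name>:
# python3 run_pretraining.py --data-dir pretrain_chinese_model --model-name chinese_model
ls pretrain_chinese_model
```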