dbiir / UER-py

Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
https://github.com/dbiir/UER-py/wiki
Apache License 2.0

Can we use the BERT-wwm pre-trained model directly? #32

Open · freedomRen opened this issue 4 years ago

freedomRen commented 4 years ago

The README links to https://github.com/ymcui/Chinese-BERT-wwm, where we can download the wwm model. After unzipping it, we find vocab.txt, pytorch_model.bin, and bert_config.json. So the question is: can we just point the configuration parameters at these files and use them directly? I tried a task that used them directly, but the accuracy was very low.

zhezhaoa commented 4 years ago

Could you provide more details about how you are using UER or the pre-trained models provided by UER? We provide conversion scripts in our project. You need to convert UER pre-trained models with convert_bert_from_uer_to_xxx.py if you want to use them in other projects (Google BERT, or Transformers from Hugging Face). Similarly, you need to convert pre-trained models from other projects with convert_bert_from_xxx_to_uer.py before using them in UER.
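
For a Chinese-BERT-wwm checkpoint, the round trip would look roughly like the sketch below. The paths are hypothetical, and the flag names (--input_model_path, --output_model_path, --layers_num) are assumptions based on how UER's other conversion scripts are typically invoked, so check each script's argument parser for the exact interface.

```
# Convert a Hugging Face-format BERT checkpoint (e.g. Chinese-BERT-wwm)
# into UER format before pre-training or fine-tuning with UER.
# --layers_num 12 matches a BERT-base model.
python3 scripts/convert_bert_from_huggingface_to_uer.py \
    --input_model_path chinese_wwm_pytorch/pytorch_model.bin \
    --output_model_path models/bert_wwm_uer.bin \
    --layers_num 12

# The reverse direction: export a UER checkpoint back to
# Hugging Face / Google BERT format for use in other projects.
python3 scripts/convert_bert_from_uer_to_huggingface.py \
    --input_model_path models/bert_wwm_uer.bin \
    --output_model_path pytorch_model.bin \
    --layers_num 12
```

After conversion, the UER-format checkpoint would then be loaded by UER's fine-tuning scripts (for example via a pretrained-model path argument) rather than by pointing them at the raw Hugging Face files, which is presumably why using the unconverted files gave very low accuracy.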