Closed: ericlormul closed this issue 3 years ago.
Have you solved it? I have met the same problem.
This error comes from using the convert_tf_checkpoint_to_pytorch
script directly: it was written to convert Google's pre-trained BERT models, which are initialised as BertForPreTraining
objects. The models we have released are BertForSequenceClassification
objects, so some changes need to be made to the convert_tf_checkpoint_to_pytorch
script for it to work correctly.
The same issue was raised on the huggingface transformers repository, and you can refer here for the solution.
Step1: Clone the pytorch-pretrained-BERT
toolkit from here --> pytorch-pretrained-BERT
Step2: Manually change the convert_tf_checkpoint_to_pytorch.py
script as shared here --> Personal Manually Changed Script
Step3: Run in the terminal python convert_tf_checkpoint_to_pytorch.py --tf_checkpoint_path [your TF model ckpt] --bert_config_file [your TF config] --pytorch_dump_path [target save path of pytorch model]
Step4: Get something delicious for celebration!
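The change in Step 2 boils down to instantiating BertForSequenceClassification instead of BertForPreTraining before loading the TF weights. A minimal sketch, assuming pytorch-pretrained-BERT's modeling API (the function name convert_tf_checkpoint and the num_labels default are illustrative, not part of the toolkit):

```python
# Sketch of the modified conversion logic from Step 2, assuming
# pytorch-pretrained-BERT's modeling module. num_labels is a guess and
# must match the classifier head of the released model.
def convert_tf_checkpoint(tf_checkpoint_path, bert_config_file,
                          pytorch_dump_path, num_labels=2):
    import torch
    from pytorch_pretrained_bert.modeling import (
        BertConfig, BertForSequenceClassification, load_tf_weights_in_bert)

    config = BertConfig.from_json_file(bert_config_file)
    # Key change: build a BertForSequenceClassification object instead of
    # the BertForPreTraining object the stock script constructs.
    model = BertForSequenceClassification(config, num_labels=num_labels)
    load_tf_weights_in_bert(model, tf_checkpoint_path)
    torch.save(model.state_dict(), pytorch_dump_path)
```

The command in Step 3 then simply passes the three paths through to this function.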
Hi, were you able to reproduce the author's results on WikiQA with the converted PyTorch weights? When I load the weights and run prediction, my metrics are far off. Any advice would be appreciated~
I used the following script to convert the pre-trained model tanda_bert_base_asnq,
but it gives an error:
I'm using transformers 2.3.0 and PyTorch 1.2.0
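For what it's worth, once a checkpoint has been converted, loading it for evaluation under transformers 2.x can be sketched as below (the paths and num_labels are placeholder assumptions; BertConfig and BertForSequenceClassification are transformers' actual class names):

```python
# Sketch: load a converted state dict with transformers 2.x.
# config_path/weights_path are placeholders; num_labels must match
# the classifier head the checkpoint was trained with.
def load_converted_model(config_path, weights_path, num_labels=2):
    import torch
    from transformers import BertConfig, BertForSequenceClassification

    config = BertConfig.from_json_file(config_path)
    config.num_labels = num_labels
    model = BertForSequenceClassification(config)
    model.load_state_dict(torch.load(weights_path))
    model.eval()  # inference mode, e.g. for scoring WikiQA pairs
    return model
```

If the state dict keys do not line up exactly, the mismatch reported by load_state_dict is usually the quickest clue to what the conversion script got wrong.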
Thanks!