microsoft / AzureML-BERT

End-to-End recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service
https://azure.microsoft.com/en-us/blog/microsoft-makes-it-easier-to-build-popular-language-representation-model-bert-at-large-scale/
MIT License

Initializing BertMultiTask model error #25

Closed · nigaregr closed this issue 5 years ago

nigaregr commented 5 years ago

Hi, I am getting the following error when I run BERT pretraining:

```
09/09/2019 10:13:43 - INFO - logger - Vocabulary contains 30522 tokens
09/09/2019 10:13:43 - INFO - logger - Initializing BertMultiTask model
Traceback (most recent call last):
  File "AzureML-BERT/pretrain/PyTorch/train_nitin.py", line 361, in <module>
    summary_writer = summary_writer)
  File "/home/nigaregr/Documents/AzureML-BERT/pretrain/PyTorch/models.py", line 121, in __init__
    self.network.register_batch(BatchType.PRETRAIN_BATCH, "pretrain_dataset", loss_calculation=BertPretrainingLoss(self.bert_encoder, bert_config))
  File "/home/nigaregr/Documents/AzureML-BERT/pretrain/PyTorch/models.py", line 25, in __init__
    self.cls = BertPreTrainingHeads(config, self.bert.embeddings.word_embeddings.weight)
TypeError: __init__() takes 2 positional arguments but 3 were given
```
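For context: the failing call in `models.py` passes the word-embedding weights to `BertPreTrainingHeads` so the LM prediction head is tied to the input embeddings, which matches the two-argument constructor in pytorch-pretrained-bert v0.6.2. The `TypeError` suggests the installed package exposes a constructor taking only `config`. A minimal runtime guard, sketched under the assumption that v0.6.2's parameter is named `bert_model_embedding_weights` (this check is hypothetical, not part of the repo):

```python
import inspect

from pytorch_pretrained_bert.modeling import BertPreTrainingHeads

# Hypothetical guard: confirm the installed pytorch-pretrained-bert exposes
# the two-argument constructor (config + embedding weights) that models.py
# relies on for weight tying.
params = inspect.signature(BertPreTrainingHeads.__init__).parameters
if "bert_model_embedding_weights" not in params:
    raise RuntimeError(
        "Incompatible pytorch-pretrained-bert: expected v0.6.2, whose "
        "BertPreTrainingHeads takes (config, bert_model_embedding_weights)."
    )
```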

skaarthik commented 5 years ago

Are you using v0.6.2 of the pytorch-pretrained-bert package?
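One quick way to check which version is installed (a generic snippet, not part of this repo):

```python
import pkg_resources

# Print the installed version; the pretraining code in this repo
# expects pytorch-pretrained-bert v0.6.2.
print(pkg_resources.get_distribution("pytorch-pretrained-bert").version)
```

If it prints something else, pinning with `pip install pytorch-pretrained-bert==0.6.2` should restore the expected `BertPreTrainingHeads` signature.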

nigaregr commented 5 years ago

Yes, that fixes it. Thanks!