MeRajat / SolvingAlmostAnythingWithBert

BioBERT PyTorch

BertConfig object has no attribute 'layer_norm_eps' #10

Open mayinghan opened 5 years ago

mayinghan commented 5 years ago

Hi, I downloaded the weights from https://github.com/naver/biobert-pretrained/releases (pre-trained weights of BioBERT v1.1 (+PubMed 1M)) and used the bert_config.json inside the downloaded archive. I was trying to load the config into Transformers (2.1.1) BERT with the from_pretrained() method. To get the state dictionary, I used the exact same method you use in your code. However, the program raised this error: 'BertConfig' object has no attribute 'layer_norm_eps'

It seems the parameters inside the downloaded bert_config.json are incomplete. How can I get a full version of the config?
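For reference, this is how I checked which fields were missing (a minimal check, assuming a stock BertConfig lists every field a complete config should carry):

from transformers import BertConfig

# dump the default config that ships with transformers; it includes
# layer_norm_eps and the other fields the downloaded json leaves out
print(BertConfig().to_json_string())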

Here is how I load it with the from_pretrained method:

import torch
from collections import OrderedDict
from transformers import BertTokenizer, BertConfig

# bioparameter and model_class are defined elsewhere in my code
tokenizer = BertTokenizer(vocab_file=bioparameter.VOCAB_FILE, do_lower_case=False)

# set up biobert model: load the converted checkpoint on CPU
tmp_d = torch.load(bioparameter.BERT_WEIGHTS, map_location='cpu')

# keep the first 199 tensors (embeddings, encoder, pooler) and strip the
# leading 'bert.' prefix so key names match what from_pretrained expects
state_dict = OrderedDict()
for i in list(tmp_d.keys())[:199]:
    x = i
    if i.find('bert') > -1:
        x = '.'.join(i.split('.')[1:])
    state_dict[x] = tmp_d[i]

config = BertConfig(vocab_size_or_config_json_file=bioparameter.BERT_CONFIG_FILE)
model = model_class.from_pretrained(None, config=config, state_dict=state_dict)
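As a stopgap I'm considering filling in the missing attribute by hand before calling from_pretrained (a sketch only; 1e-12 is the stock BERT default for layer_norm_eps, and I'm assuming the other fields in the json are fine as-is):

config = BertConfig(vocab_size_or_config_json_file=bioparameter.BERT_CONFIG_FILE)
# the TF-converted bert_config.json lacks this key, so set the BERT default
config.layer_norm_eps = 1e-12
model = model_class.from_pretrained(None, config=config, state_dict=state_dict)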