Thanks for your interest in our work! The error is caused by differences between transformers versions (I guess you are using transformers 4.x). You can resolve it in two ways: (a) use transformers version 2.1.1, or (b) modify the code to work with the transformers version you are using. For example, you can make the following changes (a minimal sketch of why they are needed follows the list):
In `basic_bert_unit/Read_data_func.py`, change the description encoding to:

```python
encode_indexs = Tokenizer.encode(string, truncation=True, max_length=des_limit, add_special_tokens=False)
```

and the entity-name encoding to:

```python
token_ids = Tokenizer.encode(ent_name, truncation=True, max_length=ent_name_max_length, add_special_tokens=False)
```

In `basic_bert_unit/Basic_Bert_Unit_model.py`, change the forward call to:

```python
x = self.bert_model(input_ids=batch_word_list, attention_mask=attention_mask, return_dict=False)
```

In `interaction_model/Basic_Bert_Unit_model.py`, change the forward call likewise:

```python
x = self.bert_model(input_ids=batch_word_list, attention_mask=attention_mask, return_dict=False)
```

In `interaction_model/get_attributeValue_embedding.py`, change the attribute-value encoding to:

```python
token_ids = Tokenizer.encode(v, truncation=True, add_special_tokens=True, max_length=max_length)
```
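For context, here is a minimal, self-contained sketch of what the two kinds of changes do under transformers 4.x. The checkpoint name, sample text, and `max_length` value below are placeholders of my own, not values from this repo:

```python
# A minimal sketch, assuming transformers 4.x and PyTorch.
import torch
from transformers import BertModel, BertTokenizer

# Placeholder checkpoint -- substitute whatever the repo actually loads.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# In 4.x the truncation strategy must be passed explicitly; older versions
# truncated implicitly whenever max_length was given.
token_ids = tokenizer.encode("an example entity description", truncation=True,
                             max_length=128, add_special_tokens=False)

# In 4.x the model returns a ModelOutput object by default. Passing
# return_dict=False restores the 2.x-style tuple, so existing code that
# indexes x[0] keeps working.
input_ids = torch.tensor([token_ids])
attention_mask = torch.ones_like(input_ids)
x = model(input_ids=input_ids, attention_mask=attention_mask, return_dict=False)
last_hidden_state = x[0]  # shape: (batch_size, seq_len, hidden_size)
```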
If you have any questions, please feel free to contact me.