JayYip / m3tl

BERT for Multitask Learning
https://jayyip.github.io/m3tl/
Apache License 2.0
545 stars · 125 forks

'TFBertEmbeddings' object has no attribute 'word_embeddings' #87

Closed wwewwt closed 2 years ago

wwewwt commented 3 years ago

Code:

    from bert_multitask_learning import train_bert_multitask, eval_bert_multitask, predict_bert_multitask

    problem_type_dict = {'toy_cls': 'cls', 'toy_seq_tag': 'seq_tag'}
    problem = 'toy_cls&toy_seq_tag'
    model = train_bert_multitask(
        problem=problem,
        num_epochs=1,
        problem_type_dict=problem_type_dict,
        processing_fn_dict=processing_fn_dict,
        continue_training=True
    )

Error:

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/run_bert_multitask.py in train_bert_multitask(problem, num_gpus, num_epochs, model_dir, params, problem_type_dict, processing_fn_dict, model, create_tf_record_only, steps_per_epoch, warmup_ratio, continue_training, mirrored_strategy)
        257
        258     model = create_keras_model(
    --> 259         mirrored_strategy=mirrored_strategy, params=params, mode=mode, inputs_to_build_model=one_batch)
        260
        261     _train_bert_multitask_keras_model(

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/run_bert_multitask.py in create_keras_model(mirrored_strategy, params, mode, inputs_to_build_model, model)
         91     if mirrored_strategy is not None:
         92         with mirrored_strategy.scope():
    ---> 93             model = _get_model_wrapper(params, mode, inputs_to_build_model, model)
         94     else:
         95         model = _get_model_wrapper(params, mode, inputs_to_build_model, model)

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/run_bert_multitask.py in _get_model_wrapper(params, mode, inputs_to_build_model, model)
         51 def _get_model_wrapper(params, mode, inputs_to_build_model, model):
         52     if model is None:
    ---> 53         model = BertMultiTask(params)
         54     # model.run_eagerly = True
         55     if mode == 'resume':

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/model_fn.py in __init__(self, params, name)
        261         self.params = params
        262         # initialize body model, aka transformers
    --> 263         self.body = BertMultiTaskBody(params=self.params)
        264         # mlm might need word embedding from bert
        265         # build sub-model

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/model_fn.py in __init__(self, params, name)
         63         super(BertMultiTaskBody, self).__init__(name=name)
         64         self.params = params
    ---> 65         self.bert = MultiModalBertModel(params=self.params)
         66         if self.params.custom_pooled_hidden_size:
         67             self.custom_pooled_layer = tf.keras.layers.Dense(

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/modeling.py in __init__(self, params, use_one_hot_embeddings)
         40         # multimodal input dense
         41         embedding_dim = get_embedding_table_from_model(
    ---> 42             self.bert_model).shape[-1]
         43         self.modal_name_list = ['image', 'others']
         44         self.multimodal_dense = {modal_name: tf.keras.layers.Dense(

    /root/.local/lib/python3.7/site-packages/bert_multitask_learning/utils.py in get_embedding_table_from_model(model)
        397 def get_embedding_table_from_model(model):
        398     base_model = get_transformer_main_model(model)
    --> 399     return base_model.embeddings.word_embeddings
        400
        401

AttributeError: 'TFBertEmbeddings' object has no attribute 'word_embeddings'

JayYip commented 3 years ago

Could you share your transformers version? This is most likely caused by an incompatible transformers version.

DM-NUM commented 2 years ago

> Could you share your transformers version? This is most likely caused by an incompatible transformers version.

I got the same problem and tried pip install transformers==4.6.0, but still got the same error. torch==1.7.1, tensorflow==2.4.1, keras==2.6.0

DM-NUM commented 2 years ago


uh.. I solved the problem with pip3 install transformers==4.1.1. Why did transformers==4.6.0 fail?
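For reference, the combination reported as working in this thread can be pinned in a requirements file (versions taken directly from the comments above; only the transformers pin is confirmed to matter here):

```
transformers==4.1.1
torch==1.7.1
tensorflow==2.4.1
keras==2.6.0
```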

JayYip commented 2 years ago

It seems a transformers update broke some internal APIs.
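For anyone who cannot downgrade, a version-tolerant accessor can paper over the rename. This is only a sketch: the `weight` fallback assumes the post-4.1 TFBertEmbeddings refactor, which builds the embedding table directly on the embeddings layer instead of a `word_embeddings` sub-layer. The stand-in objects below just mimic the two attribute layouts:

```python
from types import SimpleNamespace

def get_word_embedding_table(embeddings):
    """Fetch the word-embedding table from a TFBertEmbeddings-like object,
    tolerating the attribute rename across transformers releases."""
    # transformers <= 4.1.x: the table lives on `embeddings.word_embeddings`
    if hasattr(embeddings, "word_embeddings"):
        return embeddings.word_embeddings
    # newer 4.x releases (assumption: the refactor behind this issue)
    # build the table directly as `embeddings.weight`
    if hasattr(embeddings, "weight"):
        return embeddings.weight
    raise AttributeError(
        f"no word-embedding table on {type(embeddings).__name__}")

# stand-ins for the two attribute layouts
old_style = SimpleNamespace(word_embeddings="table-4.1")
new_style = SimpleNamespace(weight="table-4.6")
print(get_word_embedding_table(old_style))  # table-4.1
print(get_word_embedding_table(new_style))  # table-4.6
```

Patching `get_embedding_table_from_model` in `utils.py` to use a check like this would make it work on both old and new transformers versions.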

Closing the issue.