liuwei1206 / LEBERT

Code for the ACL2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter"

AttributeError: 'BertConfig' object has no attribute 'add_layers' #21

Closed BaoyanWang closed 3 years ago

BaoyanWang commented 3 years ago

Hi, I recently came across your paper. While trying to reproduce it I ran into the problem below. I'm running on a single machine (not distributed). Any advice would be appreciated...

```
2021-07-02 02:46:18:INFO: Process rank: -1, device: cpu, n_gpu: 0, distributed training: False, 16-bits training: False
2021-07-02 02:46:18:INFO: Training/evaluation parameters Namespace(adam_epsilon=1e-08, config_name='data/berts/bert/config.json', data_dir='data/dataset/NER/weibo', default_label='O', device=device(type='cpu'), do_eval=True, do_predict=True, do_shuffle=True, do_train=True, evaluate_during_training=True, fp16=False, fp16_opt_level='O1', gradient_accumulation_steps=1, label_file='data/dataset/NER/weibo/labels.txt', learning_rate=1e-05, local_rank=-1, logging_dir='data/log', logging_steps=100, max_grad_norm=1.0, max_scan_num=1000000, max_seq_length=256, max_steps=-1, max_word_num=5, model_name_or_path='data/berts/bert/pytorch_model.bin', model_type='WCBertCRF_Token', n_gpu=0, no_cuda=False, nodes=1, num_train_epochs=20, output_dir='data/dataset/NER/output', overwrite_cache=True, per_gpu_eval_batch_size=16, per_gpu_train_batch_size=4, save_steps=600, save_total_limit=50, saved_embedding_dir='data/dataset/NER/weibo', seed=106524, sgd_momentum=0.9, vocab_file='data/berts/bert/vocab.txt', warmup_steps=190, weight_decay=0.0, word_embed_dim=200, word_embedding='data/Tencent_AILab_ChineseEmbedding.txt', word_vocab_file='data/tencent_vocab.txt')
['data/tencent_vocab.txt']
Calling BertTokenizer.from_pretrained() with the path to a single file or url is deprecated
Traceback (most recent call last):
  File "Trainer.py", line 597, in <module>
    main()
  File "Trainer.py", line 547, in main
    num_labels=label_vocab.get_item_size())
  File "/opt/conda/lib/python3.7/site-packages/transformers/modeling_utils.py", line 947, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/datadrive/LEBERT/wcbert_modeling.py", line 466, in __init__
    self.bert = WCBertModel(config)
  File "/datadrive/LEBERT/wcbert_modeling.py", line 324, in __init__
    self.encoder = BertEncoder(config)
  File "/datadrive/LEBERT/wcbert_modeling.py", line 207, in __init__
    self.add_layers = config.add_layers
AttributeError: 'BertConfig' object has no attribute 'add_layers'
```
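The last traceback frame is the key: wcbert_modeling.py reads config.add_layers, a LEBERT-specific field that a stock BERT config.json does not define, so the attribute lookup fails. A minimal sketch (not from the repo) that reproduces the same AttributeError with a plain BertConfig:

```python
# Minimal reproduction sketch, assuming transformers is installed and
# data/berts/bert/config.json is an unmodified Google/HF BERT config.
from transformers import BertConfig

config = BertConfig.from_json_file("data/berts/bert/config.json")

# The stock config has no LEBERT-specific keys, so this raises
# AttributeError: 'BertConfig' object has no attribute 'add_layers'
layers = config.add_layers
```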

yiwenJG commented 3 years ago

Modify your BERT config.json file and simply add: "word_embed_dim": 200, "add_layers": [0], "HP_dropout": 0.5
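In other words, the three LEBERT-specific fields just need to exist on the config before the model is built. A minimal sketch (not from the repo) of the same fix applied programmatically; the values mirror the ones suggested above and can equally be written into config.json by hand:

```python
# Sketch only: patch a stock BERT config with the LEBERT-specific fields
# instead of editing config.json manually.
from transformers import BertConfig

config = BertConfig.from_json_file("data/berts/bert/config.json")
config.word_embed_dim = 200   # dimension of the Tencent word embeddings
config.add_layers = [0]       # layer index(es) where the lexicon adapter is attached (assumption)
config.HP_dropout = 0.5       # dropout rate for the added components (assumption)
config.to_json_file("data/berts/bert/config.json")  # write the patched config back
```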

liuwei1206 commented 3 years ago

Hi,

Sorry for the late reply. I would recommend reading the BERT code in detail; you will learn a lot from it.

Best