ShannonAI / mrc-for-flat-nested-ner

Code for ACL 2020 paper `A Unified MRC Framework for Named Entity Recognition`

Cannot reproduce the results (zh_msra data) #47

Closed · YiLing28 closed this issue 3 years ago

YiLing28 commented 3 years ago

Hello, I can't seem to reproduce the paper's results on the zh_msra data. I have read issues https://github.com/ShannonAI/mrc-for-flat-nested-ner/issues/10 and https://github.com/ShannonAI/mrc-for-flat-nested-ner/issues/9, but didn't find anything wrong there. Below is my log:

```
Better speed can be achieved with apex installed from https://www.github.com/nvidia/apex.
Please notice that merge the args_dict and json_config ... ...
{
  "bert_frozen": "false",
  "hidden_size": 768,
  "hidden_dropout_prob": 0.2,
  "classifier_sign": "multi_nonlinear",
  "clip_grad": 1,
  "bert_config": {
    "attention_probs_dropout_prob": 0.1,
    "directionality": "bidi",
    "hidden_act": "gelu",
    "hidden_dropout_prob": 0.1,
    "hidden_size": 768,
    "initializer_range": 0.02,
    "intermediate_size": 3072,
    "max_position_embeddings": 512,
    "num_attention_heads": 12,
    "num_hidden_layers": 12,
    "pooler_fc_size": 768,
    "pooler_num_attention_heads": 12,
    "pooler_num_fc_layers": 3,
    "pooler_size_per_head": 128,
    "pooler_type": "first_token_transform",
    "type_vocab_size": 2,
    "vocab_size": 21128
  },
  "config_path": "/opt/ldw/code/ner/mrc-for-flat-nested-ner/config/zh_bert.json",
  "data_dir": "/opt/ldw/code/ner/mrc-for-flat-nested-ner/data/zh_msra",
  "bert_model": "/opt/ldw/model/bert_model/chinese_L-12_H-768_A-12",
  "task_name": null,
  "max_seq_length": 128,
  "train_batch_size": 32,
  "dev_batch_size": 16,
  "test_batch_size": 16,
  "checkpoint": 300,
  "learning_rate": 1e-05,
  "num_train_epochs": 12,
  "warmup_proportion": -1.0,
  "max_grad_norm": 1.0,
  "gradient_accumulation_steps": 1,
  "seed": 2333,
  "output_dir": "/opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3",
  "data_sign": "zh_msra",
  "weight_start": 1.0,
  "weight_end": 1.0,
  "weight_span": 1.0,
  "entity_sign": "flat",
  "n_gpu": 1,
  "dropout": 0.3,
  "entity_threshold": 0.5,
  "num_data_processor": 1,
  "data_cache": true,
  "export_model": true,
  "do_lower_case": false,
  "fp16": false,
  "amp_level": "O2",
  "local_rank": -1
}
--------------------
current data_sign: zh_msra
label_list: ['NS', 'NR', 'NT', 'O']
====================
loading train data ... ...
125184
125184 train data loaded
====================
loading dev data ... ...
13908
13908 dev data loaded
====================
loading test data ... ...
13095
13095 test data loaded
######################################################################
EPOCH: 0
------------------------------
current training loss is : 0.01635975018143654
DEV: loss, acc, precision, recall, f1 0.0178 0.4048 0.164 0.3038 0.213
SAVED model path is : /opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3/bert_finetune_model_0_300.bin
TEST: loss, acc, precision, recall, f1 0.0302 0.3752 0.1964 0.3022 0.238
------------------------------
current training loss is : 0.013055425137281418
DEV: loss, acc, precision, recall, f1 0.0117 0.7726 0.2742 0.3764 0.3173
SAVED model path is : /opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3/bert_finetune_model_0_600.bin
TEST: loss, acc, precision, recall, f1 0.0186 0.6432 0.28 0.3907 0.3262
------------------------------
current training loss is : 0.006994003430008888
DEV: loss, acc, precision, recall, f1 0.01 0.5888 0.2829 0.6389 0.3921
SAVED model path is : /opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3/bert_finetune_model_0_900.bin
TEST: loss, acc, precision, recall, f1 0.0146 0.567 0.3262 0.606 0.4241
------------------------------
current training loss is : 0.008853244595229626
DEV: loss, acc, precision, recall, f1 0.0076 0.5823 0.2714 0.5139 0.3552
------------------------------
current training loss is : 0.019887499511241913
DEV: loss, acc, precision, recall, f1 0.0067 0.5843 0.2714 0.5985 0.3735
------------------------------
current training loss is : 0.046563927084207535
DEV: loss, acc, precision, recall, f1 0.0066 0.6084 0.3426 0.7808 0.4762
SAVED model path is : /opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3/bert_finetune_model_0_1800.bin
TEST: loss, acc, precision, recall, f1 0.0093 0.5949 0.3865 0.7403 0.5079
------------------------------
current training loss is : 4.87403231090866e-05
DEV: loss, acc, precision, recall, f1 0.0063 0.301 0.2011 0.7299 0.3153
------------------------------
current training loss is : 0.010479566641151905
DEV: loss, acc, precision, recall, f1 0.0063 0.3032 0.2072 0.7416 0.3239
------------------------------
current training loss is : 0.0015295092016458511
DEV: loss, acc, precision, recall, f1 0.0067 0.8078 0.4964 0.782 0.6073
SAVED model path is : /opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3/bert_finetune_model_0_2700.bin
TEST: loss, acc, precision, recall, f1 0.0085 0.7304 0.4809 0.7499 0.586
------------------------------
current training loss is : 0.006260382942855358
DEV: loss, acc, precision, recall, f1 0.0061 0.6053 0.3119 0.6876 0.4291
------------------------------
current training loss is : 0.005210643634200096
DEV: loss, acc, precision, recall, f1 0.0056 0.305 0.2231 0.7118 0.3397
------------------------------
current training loss is : 0.011097898706793785
DEV: loss, acc, precision, recall, f1 0.0053 0.3051 0.2115 0.6809 0.3228
------------------------------
current training loss is : 0.00706856045871973
DEV: loss, acc, precision, recall, f1 0.0051 0.8888 0.5676 0.635 0.5994
------------------------------
######################################################################
EPOCH: 1
current learning rate 9.5e-06
------------------------------
current training loss is : 0.0021997329313308
DEV: loss, acc, precision, recall, f1 0.0053 0.9174 0.7173 0.7799 0.7473
SAVED model path is : /opt/ldw/code/ner/mrc-for-flat-nested-ner/output_mrc_ner/zh_msra_128_1e-5_32_0.3/bert_finetune_model_1_300.bin
TEST: loss, acc, precision, recall, f1 0.0075 0.8788 0.6943 0.7269 0.7102
------------------------------
current training loss is : 0.004434032365679741
DEV: loss, acc, precision, recall, f1 0.0052 0.8964 0.6686 0.7161 0.6915
------------------------------
current training loss is : 0.000860140600707382
DEV: loss, acc, precision, recall, f1 0.0049 0.9205 0.6924 0.7299 0.7106
------------------------------
current training loss is : 0.002716178074479103
DEV: loss, acc, precision, recall, f1 0.0055 0.8643 0.5168 0.7208 0.602
------------------------------
current training loss is : 0.009529965929687023
DEV: loss, acc, precision, recall, f1 0.0049 0.6399 0.358 0.7643 0.4876
------------------------------
current training loss is : 0.037755656987428665
DEV: loss, acc, precision, recall, f1 0.0054 0.3162 0.2194 0.7333 0.3378
------------------------------
current training loss is : 1.558396252221428e-05
DEV: loss, acc, precision, recall, f1 0.0055 0.2986 0.143 0.6843 0.2365
------------------------------
current training loss is : 0.00695763947442174
DEV: loss, acc, precision, recall, f1 0.0056 0.3024 0.1639 0.7354 0.2681
------------------------------
current training loss is : 0.0003703889960888773
DEV: loss, acc, precision, recall, f1 0.0056 0.5392 0.3889 0.8396 0.5316
------------------------------
current training loss is : 0.0027276326436549425
DEV: loss, acc, precision, recall, f1 0.0053 0.9269 0.7145 0.7249 0.7197
------------------------------
current training loss is : 0.0023270882666110992
DEV: loss, acc, precision, recall, f1 0.0053 0.6187 0.3639 0.7835 0.497
------------------------------
current training loss is : 0.008502095006406307
DEV: loss, acc, precision, recall, f1 0.0053 0.6089 0.3219 0.7505 0.4505
------------------------------
current training loss is : 0.005848872475326061
DEV: loss, acc, precision, recall, f1 0.0045 0.9179 0.6696 0.7161 0.6921
------------------------------
######################################################################
EPOCH: 2
current learning rate 9.025e-06
------------------------------
current training loss is : 0.001259281998500228
DEV: loss, acc, precision, recall, f1 0.0052 0.9184 0.6877 0.7725 0.7276
------------------------------
current training loss is : 0.002061256906017661
DEV: loss, acc, precision, recall, f1 0.0054 0.8392 0.5804 0.7727 0.6629
------------------------------
current training loss is : 6.292638863669708e-05
DEV: loss, acc, precision, recall, f1 0.0059 0.5464 0.2237 0.756 0.3453
------------------------------
current training loss is : 0.0013445813674479723
DEV: loss, acc, precision, recall, f1 0.0052 0.2943 0.1882 0.7505 0.301
------------------------------
current training loss is : 0.0027943423483520746
DEV: loss, acc, precision, recall, f1 0.0058 0.5569 0.2563 0.7471 0.3817
------------------------------
current training loss is : 0.00506820622831583
DEV: loss, acc, precision, recall, f1 0.0058 0.6108 0.3441 0.7569 0.4731
------------------------------
current training loss is : 1.0660241059667896e-05
DEV: loss, acc, precision, recall, f1 0.0054 0.3354 0.1637 0.7519 0.2688
------------------------------
current training loss is : 0.007756161503493786
DEV: loss, acc, precision, recall, f1 0.0061 0.3036 0.1493 0.7576 0.2494
------------------------------
current training loss is : 0.00031303646392188966
DEV: loss, acc, precision, recall, f1 0.0066 0.3008 0.1977 0.7581 0.3136
------------------------------
current training loss is : 0.0014186813496053219
DEV: loss, acc, precision, recall, f1 0.0058 0.6454 0.3462 0.7146 0.4665
------------------------------
current training loss is : 0.0007883103098720312
DEV: loss, acc, precision, recall, f1 0.0063 0.3254 0.2393 0.7677 0.3649
------------------------------
current training loss is : 0.005411483347415924
DEV: loss, acc, precision, recall, f1 0.0063 0.2989 0.1608 0.7586 0.2653
------------------------------
current training loss is : 0.005193878896534443
DEV: loss, acc, precision, recall, f1 0.0056 0.8268 0.4942 0.7335 0.5905
------------------------------
######################################################################
EPOCH: 3
current learning rate 8.57375e-06
------------------------------
current training loss is : 0.000553997466340661
DEV: loss, acc, precision, recall, f1 0.0068 0.911 0.6529 0.7732 0.708
------------------------------
current training loss is : 0.001516917021945119
DEV: loss, acc, precision, recall, f1 0.0063 0.5209 0.2837 0.7715 0.4149
------------------------------
current training loss is : 0.00020100202527828515
DEV: loss, acc, precision, recall, f1 0.0065 0.9019 0.5997 0.7622 0.6712
------------------------------
current training loss is : 0.0011648930376395583
DEV: loss, acc, precision, recall, f1 0.0064 0.3306 0.2075 0.7545 0.3255
------------------------------
current training loss is : 0.001798806944862008
DEV: loss, acc, precision, recall, f1 0.0058 0.9034 0.5807 0.7588 0.6579
------------------------------
current training loss is : 0.005710647441446781
DEV: loss, acc, precision, recall, f1 0.0067 0.8909 0.6289 0.7689 0.6919
------------------------------
current training loss is : 5.536737262445968e-06
DEV: loss, acc, precision, recall, f1 0.0065 0.3039 0.1685 0.7729 0.2766
------------------------------
current training loss is : 0.0022274518851190805
DEV: loss, acc, precision, recall, f1 0.0063 0.2974 0.0616 0.7545 0.1139
------------------------------
current training loss is : 0.00029264131444506347
DEV: loss, acc, precision, recall, f1 0.0072 0.5927 0.3197 0.7505 0.4484
------------------------------
current training loss is : 0.0015814974904060364
DEV: loss, acc, precision, recall, f1 0.0061 0.6329 0.3492 0.7309 0.4726
------------------------------
current training loss is : 0.00047273910604417324
DEV: loss, acc, precision, recall, f1 0.0069 0.2947 0.2014 0.7605 0.3184
------------------------------
current training loss is : 0.0003899783769156784
DEV: loss, acc, precision, recall, f1 0.0066 0.2966 0.1504 0.7211 0.2489
------------------------------
current training loss is : 0.0013236759696155787
DEV: loss, acc, precision, recall, f1 0.0067 0.301 0.2294 0.7519 0.3516
------------------------------
######################################################################
EPOCH: 4
current learning rate 8.1450625e-06
------------------------------
current training loss is : 0.0018008921761065722
DEV: loss, acc, precision, recall, f1 0.0069 0.299 0.1153 0.7593 0.2002
------------------------------
current training loss is : 0.0001561229582875967
DEV: loss, acc, precision, recall, f1 0.0063 0.3044 0.157 0.7744 0.2611
------------------------------
current training loss is : 8.328224794240668e-06
DEV: loss, acc, precision, recall, f1 0.0065 0.304 0.1429 0.7522 0.2402
------------------------------
current training loss is : 0.0010962827363982797
DEV: loss, acc, precision, recall, f1 0.0083 0.2781 0.0618 0.7655 0.1143
------------------------------
current training loss is : 0.0009025464532896876
DEV: loss, acc, precision, recall, f1 0.0067 0.3799 0.1656 0.7627 0.2721
------------------------------
current training loss is : 0.0004875045269727707
DEV: loss, acc, precision, recall, f1 0.0078 0.6288 0.3303 0.7859 0.4651
------------------------------
current training loss is : 4.330176125222351e-06
DEV: loss, acc, precision, recall, f1 0.0073 0.6018 0.2063 0.7761 0.326
------------------------------
current training loss is : 0.0028885039500892162
DEV: loss, acc, precision, recall, f1 0.0071 0.327 0.1232 0.7887 0.213
------------------------------
current training loss is : 0.0003202503430657089
DEV: loss, acc, precision, recall, f1 0.0072 0.5797 0.2949 0.7749 0.4272
------------------------------
current training loss is : 0.0023023374378681183
DEV: loss, acc, precision, recall, f1 0.0064 0.3974 0.1704 0.7689 0.279
------------------------------
current training loss is : 0.0012561487965285778
DEV: loss, acc, precision, recall, f1 0.0064 0.3025 0.1853 0.7725 0.2989
------------------------------
current training loss is : 5.9736816183431074e-05
DEV: loss, acc, precision, recall, f1 0.0069 0.2973 0.0942 0.7715 0.1679
------------------------------
current training loss is : 0.0010490657296031713
DEV: loss, acc, precision, recall, f1 0.0077 0.2972 0.1339 0.7677 0.2281
------------------------------
######################################################################
EPOCH: 5
current learning rate 7.737809375e-06
------------------------------
current training loss is : 0.0007653237553313375
DEV: loss, acc, precision, recall, f1 0.0078 0.3029 0.1407 0.7827 0.2385
------------------------------
current training loss is : 0.005620667710900307
DEV: loss, acc, precision, recall, f1 0.0072 0.3056 0.1523 0.7983 0.2559
------------------------------
current training loss is : 6.944198048586259e-06
DEV: loss, acc, precision, recall, f1 0.0082 0.5923 0.2119 0.7772 0.333
------------------------------
current training loss is : 0.001078870496712625
DEV: loss, acc, precision, recall, f1 0.0069 0.5109 0.1174 0.8313 0.2057
------------------------------
current training loss is : 0.0005673955893144011
DEV: loss, acc, precision, recall, f1 0.0077 0.3115 0.2161 0.8172 0.3419
------------------------------
current training loss is : 0.0051736971363425255
DEV: loss, acc, precision, recall, f1 0.0073 0.7691 0.4691 0.8136 0.5951
------------------------------
current training loss is : 3.1025990665511927e-06
DEV: loss, acc, precision, recall, f1 0.0083 0.2989 0.1682 0.8021 0.2781
------------------------------
current training loss is : 0.0005125850439071655
DEV: loss, acc, precision, recall, f1 0.0083 0.3001 0.1163 0.8315 0.2041
------------------------------
current training loss is : 0.00023488891019951552
DEV: loss, acc, precision, recall, f1 0.0069 0.2962 0.1543 0.8368 0.2606
------------------------------
current training loss is : 0.0006917217979207635
DEV: loss, acc, precision, recall, f1 0.0073 0.4335 0.1804 0.826 0.2962
------------------------------
current training loss is : 0.00034669646993279457
DEV: loss, acc, precision, recall, f1 0.0078 0.293 0.1014 0.8327 0.1808
------------------------------
current training loss is : 0.0002028668241109699
DEV: loss, acc, precision, recall, f1 0.0078 0.298 0.0643 0.8243 0.1194
------------------------------
current training loss is : 0.0009247356210835278
DEV: loss, acc, precision, recall, f1 0.0078 0.2994 0.1389 0.8305 0.2379
------------------------------
######################################################################
EPOCH: 6
current learning rate 7.35091890625e-06
------------------------------
current training loss is : 0.00019250727200414985
DEV: loss, acc, precision, recall, f1 0.009 0.3032 0.1612 0.8363 0.2703
------------------------------
current training loss is : 5.059465183876455e-05
DEV: loss, acc, precision, recall, f1 0.0077 0.2978 0.1544 0.8473 0.2612
------------------------------
current training loss is : 5.413561666500755e-06
DEV: loss, acc, precision, recall, f1 0.0079 0.296 0.1279 0.8518 0.2225
------------------------------
current training loss is : 0.0004280754073988646
DEV: loss, acc, precision, recall, f1 0.0078 0.2963 0.0968 0.869 0.1742
------------------------------
current training loss is : 0.0005110411439090967
DEV: loss, acc, precision, recall, f1 0.0086 0.2959 0.1405 0.8707 0.2419
------------------------------
current training loss is : 0.00028553747688420117
DEV: loss, acc, precision, recall, f1 0.0079 0.298 0.1965 0.8683 0.3205
------------------------------
current training loss is : 2.552310661485535e-06
DEV: loss, acc, precision, recall, f1 0.0084 0.2953 0.0978 0.8877 0.1761
------------------------------
current training loss is : 0.0005301011842675507
```

The longer training runs, the lower the F1 gets. Is something going wrong here? The data was downloaded from the link you provided.
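
The degradation is easy to quantify from the log itself. Below is a minimal sketch (not part of the repo) that scans a log in the format shown above, where each evaluation appears as a `DEV:`/`TEST:` header followed by five numbers, and reports the best dev F1 seen so far; the log file name is a placeholder:

```python
import re

# Each evaluation line in the log looks like:
#   DEV: loss, acc, precision, recall, f1 0.0178 0.4048 0.164 0.3038 0.213
METRIC_RE = re.compile(
    r"^(DEV|TEST): loss, acc, precision, recall, f1\s+"
    r"([\d.eE+-]+)\s+([\d.eE+-]+)\s+([\d.eE+-]+)\s+([\d.eE+-]+)\s+([\d.eE+-]+)"
)

def best_dev_f1(log_path: str):
    """Return (best_f1, line_no) over all DEV evaluations in the log."""
    best = (0.0, -1)
    with open(log_path, encoding="utf-8") as fh:
        for line_no, line in enumerate(fh, 1):
            match = METRIC_RE.match(line.strip())
            if match and match.group(1) == "DEV":
                f1 = float(match.group(6))  # last captured field is f1
                if f1 > best[0]:
                    best = (f1, line_no)
    return best

if __name__ == "__main__":
    # "train.log" is a placeholder for wherever the training output was captured.
    f1, line_no = best_dev_f1("train.log")
    print(f"best dev F1 so far: {f1:.4f} (log line {line_no})")
```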

ghost commented 3 years ago

Hello, and thanks for the question! Please refer to our parameter configuration for Chinese MSRA.
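
One way to follow this suggestion is to compare the merged configuration printed at startup (the JSON block in the log above) against the recommended zh_msra settings key by key. A minimal sketch, assuming both configurations have been saved as flat JSON files; the file names below are placeholders, not files shipped with the repo:

```python
import json

def diff_configs(mine_path: str, recommended_path: str) -> None:
    """Print every top-level key whose value differs between two JSON configs."""
    with open(mine_path, encoding="utf-8") as fh:
        mine = json.load(fh)
    with open(recommended_path, encoding="utf-8") as fh:
        recommended = json.load(fh)

    for key in sorted(set(mine) | set(recommended)):
        if mine.get(key) != recommended.get(key):
            print(f"{key}: mine={mine.get(key)!r}  recommended={recommended.get(key)!r}")

if __name__ == "__main__":
    # Placeholder paths: dump your own merged config (as printed at startup)
    # and the recommended zh_msra settings into these files first.
    diff_configs("my_zh_msra_config.json", "recommended_zh_msra_config.json")
```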

Thank you very much!