jiesutd / LatticeLSTM

Chinese NER using Lattice LSTM. Code for ACL 2018 paper.

Hello, a question about running a new dataset #90

Closed Biubiulity closed 4 years ago

Biubiulity commented 4 years ago

D:\Anaconda3\python.exe G:/experiment/main.py
CuDNN: True
GPU available: False
Status: train
Seg: True
Train file: ./data/onto4ner.cn/weiboNER_2nd_conll.train
Dev file: ./data/onto4ner.cn/weiboNER_2nd_conll.dev
Test file: ./data/onto4ner.cn/weiboNER_2nd_conll.test
Raw file: ./data/onto4ner.cn/weiboNER_2nd_conll.test
Char emb: data/gigaword_chn.all.a2b.uni.ite50.vec
Bichar emb: None
Gaz file: data/ctb.50d.vec
Model saved to: ./data/onto4ner.cn/saved_model
Load gaz file: data/ctb.50d.vec total size: 704368
gaz alphabet size: 2
gaz alphabet size: 2
gaz alphabet size: 2
build word pretrain emb...
Embedding: pretrain word:11327, prefect match:0, case_match:0, oov:3387, oov%:0.9997048406139315
build biword pretrain emb...
Embedding: pretrain word:0, prefect match:0, case_match:0, oov:45491, oov%:0.9999780181130749
build gaz pretrain emb...
Embedding: pretrain word:704368, prefect match:0, case_match:0, oov:1, oov%:0.5
Training model...
DATA SUMMARY START:
Tag scheme: BIO
MAX SENTENCE LENGTH: 250
MAX WORD LENGTH: -1
Number normalized: True
Use bigram: False
Word alphabet size: 3388
Biword alphabet size: 45492
Char alphabet size: 3357
Gaz alphabet size: 2
Label alphabet size: 18
Word embedding size: 50
Biword embedding size: 50
Char embedding size: 50
Gaz embedding size: 50
Norm word emb: True
Norm biword emb: True
Norm gaz emb: False
Norm gaz dropout: 0.5
Train instance number: 1350
Dev instance number: 270
Test instance number: 270
Raw instance number: 0
Hyperpara iteration: 100
Hyperpara batch size: 1
Hyperpara lr: 0.015
Hyperpara lr_decay: 0.05
Hyperpara HP_clip: 5.0
Hyperpara momentum: 0
Hyperpara hidden_dim: 200
Hyperpara dropout: 0.5
Hyperpara lstm_layer: 1
Hyperpara bilstm: True
Hyperpara GPU: False
Hyperpara use_gaz: True
Hyperpara fix gaz emb: False
Hyperpara use_char: False
DATA SUMMARY END.
Data setting saved to file: ./data/onto4ner.cn/saved_model.dset
build batched lstmcrf...
build batched bilstm...
build LatticeLSTM... forward, Fix emb: False gaz drop: 0.5
load pretrain word emb... (2, 50)
build LatticeLSTM... backward, Fix emb: False gaz drop: 0.5
load pretrain word emb... (2, 50)
build batched crf...
finished built model.
Epoch: 0/100
Learning rate is setted as: 0.015
Instance: 500; Time: 43.23s; loss: 8353.7121; acc: 25399/27265=0.9316
Instance: 1000; Time: 36.27s; loss: 6507.9950; acc: 50956/54737=0.9309
Instance: 1350; Time: 23.01s; loss: 3835.8206; acc: 68546/73778=0.9291
Epoch: 0 training finished. Time: 102.51s, speed: 13.17st/s, total loss: 18697.527671813965
gold_num = 389 pred_num = 7 right_num = 0
Dev: time: 5.64s, speed: 48.13st/s; acc: 0.9330, p: 0.0000, r: 0.0000, f: -1.0000
gold_num = 414 pred_num = 2 right_num = 1
Test: time: 6.57s, speed: 41.16st/s; acc: 0.9274, p: 0.5000, r: 0.0024, f: 0.0048
Epoch: 1/100
Learning rate is setted as: 0.014249999999999999
Instance: 500; Time: 32.57s; loss: 4907.2505; acc: 25699/27435=0.9367
Instance: 1000; Time: 31.39s; loss: 4013.4990; acc: 50852/54336=0.9359
Instance: 1350; Time: 22.50s; loss: 2766.0821; acc: 69083/73778=0.9364
Epoch: 1 training finished. Time: 86.46s, speed: 15.61st/s, total loss: 11686.831638336182
gold_num = 389 pred_num = 144 right_num = 93
Dev: time: 5.27s, speed: 51.43st/s; acc: 0.9453, p: 0.6458, r: 0.2391, f: 0.3490
Exceed previous best f score: -1
gold_num = 414 pred_num = 107 right_num = 56
Test: time: 5.30s, speed: 51.53st/s; acc: 0.9361, p: 0.5234, r: 0.1353, f: 0.2150
Epoch: 2/100
Learning rate is setted as: 0.0135375
Instance: 500; Time: 31.27s; loss: 3183.9019; acc: 25468/26936=0.9455
Instance: 1000; Time: 32.47s; loss: 3497.6632; acc: 51754/55066=0.9399
Instance: 1350; Time: 22.58s; loss: 2232.6319; acc: 69466/73778=0.9416
Epoch: 2 training finished. Time: 86.32s, speed: 15.64st/s, total loss: 8914.197044372559
gold_num = 389 pred_num = 173 right_num = 106
Dev: time: 5.64s, speed: 48.05st/s; acc: 0.9482, p: 0.6127, r: 0.2725, f: 0.3772
Exceed previous best f score: 0.34896810506566606
gold_num = 414 pred_num = 123 right_num = 78
Test: time: 5.92s, speed: 46.38st/s; acc: 0.9402, p: 0.6341, r: 0.1884, f: 0.2905
Epoch: 3/100
Learning rate is setted as: 0.012860624999999997
Instance: 500; Time: 32.10s; loss: 2750.2932; acc: 25878/27331=0.9468
Instance: 1000; Time: 32.04s; loss: 2797.1717; acc: 51392/54325=0.9460
Instance: 1350; Time: 24.10s; loss: 1805.7985; acc: 69830/73778=0.9465
Epoch: 3 training finished. Time: 88.24s, speed: 15.30st/s, total loss: 7353.263328552246
gold_num = 389 pred_num = 140 right_num = 89
Dev: time: 5.60s, speed: 48.35st/s; acc: 0.9478, p: 0.6357, r: 0.2288, f: 0.3365
gold_num = 414 pred_num = 87 right_num = 62
Test: time: 6.24s, speed: 43.38st/s; acc: 0.9375, p: 0.7126, r: 0.1498, f: 0.2475
Epoch: 4/100
Learning rate is setted as: 0.012217593749999998
Instance: 500; Time: 38.97s; loss: 2368.9777; acc: 27038/28385=0.9525
Instance: 1000; Time: 35.29s; loss: 2310.6736; acc: 51716/54387=0.9509
Instance: 1350; Time: 25.06s; loss: 1737.6990; acc: 70130/73778=0.9506
Epoch: 4 training finished. Time: 99.33s, speed: 13.59st/s, total loss: 6417.350269317627
gold_num = 389 pred_num = 242 right_num = 139
Dev: time: 5.80s, speed: 46.66st/s; acc: 0.9506, p: 0.5744, r: 0.3573, f: 0.4406
Exceed previous best f score: 0.3772241992882562
gold_num = 414 pred_num = 205 right_num = 120
Test: time: 6.18s, speed: 44.02st/s; acc: 0.9450, p: 0.5854, r: 0.2899, f: 0.3877
Epoch: 5/100
Learning rate is setted as: 0.011606714062499995
Instance: 500; Time: 37.95s; loss: 1882.3516; acc: 25622/26726=0.9587
Instance: 1000; Time: 35.38s; loss: 2224.0402; acc: 51347/53790=0.9546
Instance: 1350; Time: 23.87s; loss: 1504.2289; acc: 70438/73778=0.9547
Epoch: 5 training finished. Time: 97.19s, speed: 13.89st/s, total loss: 5610.620803833008
gold_num = 389 pred_num = 254 right_num = 146
Dev: time: 5.90s, speed: 45.86st/s; acc: 0.9515, p: 0.5748, r: 0.3753, f: 0.4541
Exceed previous best f score: 0.44057052297939775
gold_num = 414 pred_num = 229 right_num = 140
Test: time: 6.11s, speed: 44.64st/s; acc: 0.9471, p: 0.6114, r: 0.3382, f: 0.4355
Epoch: 6/100
Learning rate is setted as: 0.011026378359374997
Instance: 500; Time: 32.10s; loss: 1676.2667; acc: 25272/26259=0.9624
Instance: 1000; Time: 32.24s; loss: 1810.9099; acc: 51960/54019=0.9619
Instance: 1350; Time: 23.36s; loss: 1524.6076; acc: 70781/73778=0.9594
Epoch: 6 training finished. Time: 87.70s, speed: 15.39st/s, total loss: 5011.784099578857
gold_num = 389 pred_num = 255 right_num = 161
Dev: time: 5.46s, speed: 49.55st/s; acc: 0.9553, p: 0.6314, r: 0.4139, f: 0.5000
Exceed previous best f score: 0.4541213063763608
gold_num = 414 pred_num = 225 right_num = 148
Test: time: 6.74s, speed: 47.90st/s; acc: 0.9486, p: 0.6578, r: 0.3575, f: 0.4632
Epoch: 7/100
Learning rate is setted as: 0.010475059441406245
Instance: 500; Time: 32.21s; loss: 1650.2751; acc: 26363/27373=0.9631
Instance: 1000; Time: 32.79s; loss: 1681.9573; acc: 52655/54691=0.9628
Instance: 1350; Time: 22.37s; loss: 1277.5512; acc: 70933/73778=0.9614
Epoch: 7 training finished. Time: 87.37s, speed: 15.45st/s, total loss: 4609.783710479736
gold_num = 389 pred_num = 301 right_num = 164
Dev: time: 5.38s, speed: 50.37st/s; acc: 0.9515, p: 0.5449, r: 0.4216, f: 0.4754
gold_num = 414 pred_num = 324 right_num = 162
Test: time: 5.61s, speed: 48.31st/s; acc: 0.9444, p: 0.5000, r: 0.3913, f: 0.4390
Epoch: 8/100
Learning rate is setted as: 0.009951306469335933
Instance: 500; Time: 33.81s; loss: 1308.9993; acc: 26565/27332=0.9719
Instance: 1000; Time: 34.60s; loss: 1639.5016; acc: 52881/54695=0.9668
Instance: 1350; Time: 23.46s; loss: 1182.6076; acc: 71262/73778=0.9659
Epoch: 8 training finished. Time: 91.87s, speed: 14.70st/s, total loss: 4131.108478546143
gold_num = 389 pred_num = 314 right_num = 180
Dev: time: 5.28s, speed: 51.31st/s; acc: 0.9540, p: 0.5732, r: 0.4627, f: 0.5121
Exceed previous best f score: 0.5
gold_num = 414 pred_num = 315 right_num = 172
Test: time: 5.49s, speed: 49.49st/s; acc: 0.9481, p: 0.5460, r: 0.4155, f: 0.4719
Epoch: 9/100
Learning rate is setted as: 0.009453741145869136
Instance: 500; Time: 33.62s; loss: 1346.7611; acc: 26282/27151=0.9680
Instance: 1000; Time: 34.71s; loss: 1505.3554; acc: 53071/54864=0.9673
Instance: 1350; Time: 23.36s; loss: 976.1559; acc: 71384/73778=0.9676
Epoch: 9 training finished. Time: 91.69s, speed: 14.72st/s, total loss: 3828.2723808288574
gold_num = 389 pred_num = 304 right_num = 174
Dev: time: 5.52s, speed: 49.01st/s; acc: 0.9526, p: 0.5724, r: 0.4473, f: 0.5022
gold_num = 414 pred_num = 297 right_num = 161
Test: time: 5.78s, speed: 46.87st/s; acc: 0.9460, p: 0.5421, r: 0.3889, f: 0.4529

The above is part of the output from running the Weibo dataset. My question: why does "gaz alphabet size" in the log stay at 2 the whole time?
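As a side note, the p/r/f values printed in the log follow directly from the gold_num / pred_num / right_num counts on the line before them. This is a sketch of the standard entity-level precision/recall/F1 computation (not the repository's exact code); it also reproduces the "f: -1.0000" seen in epoch 0, where nothing was predicted correctly:

```python
def prf(gold_num, pred_num, right_num):
    # Entity-level precision, recall and F1.
    # f is -1 when p + r == 0, matching "f: -1.0000" in epoch 0 of the log.
    p = right_num / pred_num if pred_num else 0.0
    r = right_num / gold_num if gold_num else 0.0
    f = 2 * p * r / (p + r) if p + r > 0 else -1
    return p, r, f

# Epoch 8 dev result from the log: gold_num = 389, pred_num = 314, right_num = 180
p, r, f = prf(389, 314, 180)
print(round(p, 4), round(r, 4), round(f, 4))  # 0.5732 0.4627 0.5121
```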

jiesutd commented 4 years ago

It looks like this is caused by a problem with your input format: the word embeddings and the gaz lexicon cannot be matched. Please check whether your input data follows the format

字 label
字 label

i.e. one character and its label per line.

My guess is that your input is actually in the format 字position label (with a position marker attached to each character).
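If the data does carry an extra position digit glued onto each character (as in the raw Weibo NER release, where lines look like "我0 O"), a small preprocessing pass can restore the two-column "字 label" format the code expects. This is a hypothetical helper under that assumption, not part of the repository:

```python
def to_char_label(line: str) -> str:
    """Convert a Weibo-style line such as '我0 O' to '字 label' format ('我 O').

    Assumes a single segmentation digit is appended to the character;
    blank lines (sentence boundaries) are passed through unchanged.
    """
    line = line.rstrip("\n")
    if not line.strip():
        return line
    token, label = line.split()
    if len(token) > 1 and token[-1].isdigit():
        token = token[:-1]  # drop the trailing position digit
    return token + " " + label

# Usage sketch: rewrite a whole file (hypothetical paths)
# with open("weiboNER_2nd_conll.train") as src, open("weibo.char.train", "w") as dst:
#     for raw in src:
#         dst.write(to_char_label(raw) + "\n")
```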

mingxixixi commented 4 years ago


Hello, did you manage to run the author's dataset? I keep getting this error:

RuntimeError: set_storage is not allowed on a Tensor created from .data or .detach(). If your intent is to change the metadata of a Tensor (such as sizes / strides / storage / storage_offset) without autograd tracking the change, remove the .data / .detach() call and wrap the change in a with torch.no_grad(): block. For example, change: x.data.set_(y) to: with torch.no_grad(): x.set_(y)
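The error message itself points at the fix: recent PyTorch versions reject in-place storage changes made through .data, so the offending x.data.set_(y) calls in the model code need to be wrapped in torch.no_grad(). A minimal sketch of the pattern the message suggests (illustrative tensors, not the repository's actual code):

```python
import torch

x = torch.zeros(3)
y = torch.ones(5)

# Old pattern, rejected by recent PyTorch:
#     x.data.set_(y)
# Replacement suggested by the error message:
with torch.no_grad():
    x.set_(y)  # x now shares y's storage, size and strides

print(tuple(x.shape))  # (5,)
```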