Closed: leon2milan closed this issue 3 years ago
Your training data looks wrong — how does the training set have only 199 batches?
Hi, I tried macbert, trained on the SIGHAN dataset without changing any parameters, and I also couldn't reproduce the reported results. Did I do something wrong? Training on SIGHAN, my training set has only 235 batches.
DATALOADER:0 TEST RESULTS {'val_loss': 0.0411618158855624}
I added wang271k to the training data — the SIGHAN dataset alone is too small.
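The batch-count discrepancy above follows directly from dataset size. A minimal sketch of the arithmetic — the sample counts and the batch size of 32 are assumptions for illustration, not taken from the repo's config:

```python
import math

def num_batches(n_samples: int, batch_size: int) -> int:
    """Batches per epoch for a dataset of n_samples (last batch may be partial)."""
    return math.ceil(n_samples / batch_size)

# Hypothetical sizes: a SIGHAN-only train split vs. SIGHAN + wang271k combined.
print(num_batches(7_500, 32))             # SIGHAN alone -> a few hundred batches
print(num_batches(7_500 + 271_329, 32))   # combined -> thousands of batches
```

So if two runs show very different batch counts at the same batch size, they were almost certainly trained on different data.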
Thanks for the reply! I'll try that too.
I followed the steps and got a different result.
Epoch 9: 100% 199/199 [00:55<00:00, 3.56it/s, loss=0.103, v_num=1]
/home/dell/workspace/jiangbingyu/correction/checkpoints/SoftMaskedBert/epoch=09-val_loss=0.13123.ckpt
Testing: 0it [00:00, ?it/s]
2021-09-08 23:47:58,342 SoftMaskedBertModel INFO: Testing...
Testing: 97% 67/69 [00:03<00:00, 18.43it/s]
2021-09-08 23:48:02,103 SoftMaskedBertModel INFO: Test.
2021-09-08 23:48:02,105 SoftMaskedBertModel INFO: loss: 0.08779423662285873
2021-09-08 23:48:02,105 SoftMaskedBertModel INFO: Detection: acc: 0.5000
2021-09-08 23:48:02,106 SoftMaskedBertModel INFO: Correction: acc: 0.6900
2021-09-08 23:48:02,114 SoftMaskedBertModel INFO: The detection result is precision=0.8228782287822878, recall=0.6308345120226309 and F1=0.7141713370696557
2021-09-08 23:48:02,115 SoftMaskedBertModel INFO: The correction result is precision=0.7399103139013453, recall=0.6534653465346535 and F1=0.694006309148265
2021-09-08 23:48:02,116 SoftMaskedBertModel INFO: Sentence Level: acc:0.690000, precision:0.829508, recall:0.466790, f1:0.597403
Testing: 100% 69/69 [00:03<00:00, 18.27it/s]
DATALOADER:0 TEST RESULTS {'val_loss': 0.08779423662285873}
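As a sanity check when comparing runs: the F1 values printed in the log are the standard harmonic mean of precision and recall, so they can be recomputed from the precision/recall pairs directly. A minimal sketch using the detection numbers from the log above:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Detection precision/recall copied from the pasted test log.
p, r = 0.8228782287822878, 0.6308345120226309
print(f1(p, r))  # matches the detection F1 reported in the log (~0.714171)
```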