2023-05-26 01:39:08 - INFO: dict_items([('dataset', 'resume-zh'), ('dist_emb_size', 20), ('type_emb_size', 20), ('lstm_hid_size', 512), ('conv_hid_size', 96), ('bert_hid_size', 768), ('biaffine_size', 512), ('ffnn_hid_size', 288), ('dilation', [1, 2, 3]), ('emb_dropout', 0.5), ('conv_dropout', 0.5), ('out_dropout', 0.33), ('epochs', 10), ('batch_size', 16), ('learning_rate', 0.001), ('weight_decay', 0), ('clip_grad_norm', 5.0), ('bert_name', 'bert-base-chinese'), ('bert_learning_rate', 5e-06), ('warm_factor', 0.1), ('use_bert_last_4_layers', False), ('seed', 123), ('config', 'config/resume-zh.json'), ('device', 'cpu'), ('fp16', False), ('use_precision_alignment', False)])
2023-05-26 01:39:08 - INFO: Loading Data
2023-05-26 01:39:08 - INFO:
+-----------+-----------+----------+
| resume-zh | sentences | entities |
+-----------+-----------+----------+
| train | 3819 | 13438 |
| dev | 463 | 1497 |
| test | 477 | 1630 |
+-----------+-----------+----------+
2023-05-26 01:39:26 - INFO: Building Model
Some weights of the model checkpoint at bert-base-chinese were not used when initializing BertModel: ['cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.weight']
This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
2023-05-26 01:39:28 - INFO: Epoch: 0
---start training---
s222 x type: torch.float32
s222 x type: torch.Size([15, 552, 180, 180])
/data/zhejiang/yusl/anaconda3/envs/w2ner_teco/lib/python3.8/site-packages/sklearn/metrics/_classification.py:1308: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use zero_division parameter to control this behavior.
_warn_prf(average, modifier, msg_start, len(result))
2023-05-26 01:44:07 - INFO: EVAL Label F1 [0.993111 0. 0. 0. 0. 0. 0. 0.
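The per-label F1 of 0.0 for most labels at epoch 0, together with the sklearn UndefinedMetricWarning above, just means several entity types were never predicted yet. If the warning is noisy, it can be silenced by passing sklearn's `zero_division` parameter explicitly; a minimal sketch with placeholder label arrays (not values from this run):

```python
# Hedged example: set zero_division explicitly so per-label F1 for labels
# with no predicted samples is 0.0 without emitting the warning.
from sklearn.metrics import f1_score

y_true = [0, 0, 1, 2, 0]
y_pred = [0, 0, 0, 0, 0]  # labels 1 and 2 are never predicted

per_label_f1 = f1_score(y_true, y_pred, average=None, zero_division=0)
print(per_label_f1)
```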
Hello fellow alum, could you help take a look at adapting the source code to run on CPU?
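For reference, the usual CPU adaptation is to replace hard-coded `.cuda()` calls with an explicit device, move the model and batch tensors with `.to(device)`, load any GPU-saved checkpoint with `map_location`, and keep `fp16=False` as in the config above, since AMP autocast is CUDA-oriented. A minimal, self-contained sketch using a dummy model and random tensors as stand-ins, not the repository's actual training code:

```python
# Hedged sketch of a typical CPU adaptation. nn.Linear and the random inputs
# are stand-ins for the real model and batches, not this repo's identifiers.
import torch
import torch.nn as nn

# Pick the device once instead of calling .cuda() unconditionally.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(768, 9).to(device)       # stand-in model; replaces model.cuda()
inputs = torch.randn(16, 768).to(device)   # stand-in batch; replaces inputs.cuda()
logits = model(inputs)

# A checkpoint saved on GPU needs its storages remapped when loaded on CPU.
torch.save(model.state_dict(), "checkpoint_demo.pt")
state = torch.load("checkpoint_demo.pt", map_location=device)
model.load_state_dict(state)

print(logits.shape, device)
```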