649453932 / Chinese-Text-Classification-Pytorch

Chinese text classification: TextCNN, TextRNN, FastText, TextRCNN, BiLSTM_Attention, DPCNN, Transformer. Based on PyTorch, works out of the box.
MIT License

Running TextCNN directly with the author's code and dataset, accuracy is 0.898 #71

Open sloanqin opened 3 years ago

sloanqin commented 3 years ago

I didn't reach the author's reported 0.91 — how did it go for everyone else? Is something misconfigured on my end? I changed nothing and ran the author's dataset directly, with randomly initialized word vectors.

```
Loading data...
171223it [00:01, 158092.93it/s]
Vocab size: 4762
180000it [00:01, 163141.73it/s]
180000it [00:01, 90840.01it/s]
10000it [00:00, 85676.20it/s]
9398it [00:00, 93301.18it/s]
Time usage: 0:00:03
10000it [00:00, 93711.34it/s]
<bound method Module.parameters of Model(
  (embedding): Embedding(4762, 300, padding_idx=4761)
  (convs): ModuleList(
    (0): Conv2d(1, 256, kernel_size=(2, 300), stride=(1, 1))
    (1): Conv2d(1, 256, kernel_size=(3, 300), stride=(1, 1))
    (2): Conv2d(1, 256, kernel_size=(4, 300), stride=(1, 1))
  )
  (dropout): Dropout(p=0.5, inplace=False)
  (fc): Linear(in_features=768, out_features=10, bias=True)
)>
Epoch [1/20]
Iter:    0, Train Loss:  2.4, Train Acc:  8.59%, Val Loss:  2.2, Val Acc: 17.78%, Time: 0:00:04
Iter:  100, Train Loss: 0.96, Train Acc: 71.09%, Val Loss: 0.71, Val Acc: 78.01%, Time: 0:00:26
Iter:  200, Train Loss:  1.0, Train Acc: 70.31%, Val Loss: 0.61, Val Acc: 80.84%, Time: 0:00:48
Iter:  300, Train Loss: 0.73, Train Acc: 75.78%, Val Loss: 0.53, Val Acc: 83.78%, Time: 0:01:10
Iter:  400, Train Loss: 0.89, Train Acc: 76.56%, Val Loss: 0.51, Val Acc: 84.35%, Time: 0:01:43
Iter:  500, Train Loss: 0.62, Train Acc: 81.25%, Val Loss: 0.49, Val Acc: 85.00%, Time: 0:02:15
Iter:  600, Train Loss: 0.63, Train Acc: 79.69%, Val Loss: 0.49, Val Acc: 84.84%, Time: 0:02:48
Iter:  700, Train Loss: 0.76, Train Acc: 74.22%, Val Loss: 0.45, Val Acc: 85.99%, Time: 0:03:19
Iter:  800, Train Loss: 0.68, Train Acc: 81.25%, Val Loss: 0.44, Val Acc: 86.40%, Time: 0:03:50
Iter:  900, Train Loss: 0.58, Train Acc: 85.94%, Val Loss: 0.43, Val Acc: 86.74%, Time: 0:04:22
Iter: 1000, Train Loss: 0.42, Train Acc: 85.16%, Val Loss: 0.44, Val Acc: 86.51%, Time: 0:04:53
Iter: 1100, Train Loss: 0.51, Train Acc: 85.16%, Val Loss: 0.43, Val Acc: 86.73%, Time: 0:05:23
Iter: 1200, Train Loss: 0.44, Train Acc: 83.59%, Val Loss: 0.42, Val Acc: 86.98%, Time: 0:05:54
Iter: 1300, Train Loss:  0.6, Train Acc: 82.03%, Val Loss: 0.42, Val Acc: 86.71%, Time: 0:06:22
Iter: 1400, Train Loss: 0.63, Train Acc: 79.69%, Val Loss: 0.41, Val Acc: 87.39%, Time: 0:06:49
Epoch [2/20]
Iter: 1500, Train Loss: 0.51, Train Acc: 84.38%, Val Loss:  0.4, Val Acc: 88.07%, Time: 0:07:20
Iter: 1600, Train Loss:  0.4, Train Acc: 89.06%, Val Loss:  0.4, Val Acc: 87.81%, Time: 0:07:50
Iter: 1700, Train Loss: 0.47, Train Acc: 85.16%, Val Loss: 0.39, Val Acc: 88.21%, Time: 0:08:21
Iter: 1800, Train Loss: 0.45, Train Acc: 86.72%, Val Loss:  0.4, Val Acc: 87.75%, Time: 0:08:51
Iter: 1900, Train Loss: 0.38, Train Acc: 90.62%, Val Loss: 0.39, Val Acc: 88.06%, Time: 0:09:22
Iter: 2000, Train Loss: 0.43, Train Acc: 85.16%, Val Loss: 0.38, Val Acc: 88.12%, Time: 0:09:52
Iter: 2100, Train Loss: 0.51, Train Acc: 83.59%, Val Loss: 0.38, Val Acc: 88.60%, Time: 0:10:21
Iter: 2200, Train Loss: 0.41, Train Acc: 83.59%, Val Loss: 0.38, Val Acc: 88.64%, Time: 0:10:52
Iter: 2300, Train Loss: 0.38, Train Acc: 89.84%, Val Loss: 0.38, Val Acc: 88.51%, Time: 0:11:21
Iter: 2400, Train Loss: 0.42, Train Acc: 87.50%, Val Loss:  0.4, Val Acc: 88.15%, Time: 0:11:50
Iter: 2500, Train Loss: 0.29, Train Acc: 89.06%, Val Loss: 0.37, Val Acc: 88.90%, Time: 0:12:19
Iter: 2600, Train Loss: 0.49, Train Acc: 84.38%, Val Loss: 0.37, Val Acc: 88.58%, Time: 0:12:48
Iter: 2700, Train Loss: 0.35, Train Acc: 85.16%, Val Loss: 0.37, Val Acc: 88.66%, Time: 0:13:19
Iter: 2800, Train Loss: 0.52, Train Acc: 81.25%, Val Loss: 0.38, Val Acc: 88.16%, Time: 0:13:49
Epoch [3/20]
Iter: 2900, Train Loss: 0.55, Train Acc: 85.16%, Val Loss: 0.37, Val Acc: 88.50%, Time: 0:14:18
Iter: 3000, Train Loss: 0.39, Train Acc: 86.72%, Val Loss: 0.36, Val Acc: 89.12%, Time: 0:14:47
Iter: 3100, Train Loss: 0.33, Train Acc: 89.06%, Val Loss: 0.38, Val Acc: 88.09%, Time: 0:15:16
Iter: 3200, Train Loss: 0.42, Train Acc: 85.94%, Val Loss: 0.37, Val Acc: 88.62%, Time: 0:15:45
Iter: 3300, Train Loss: 0.51, Train Acc: 85.94%, Val Loss: 0.37, Val Acc: 88.65%, Time: 0:16:15
Iter: 3400, Train Loss:  0.5, Train Acc: 84.38%, Val Loss: 0.36, Val Acc: 89.06%, Time: 0:16:43
Iter: 3500, Train Loss: 0.28, Train Acc: 90.62%, Val Loss: 0.36, Val Acc: 88.93%, Time: 0:17:13
Iter: 3600, Train Loss:  0.2, Train Acc: 93.75%, Val Loss: 0.36, Val Acc: 88.86%, Time: 0:17:41
Iter: 3700, Train Loss: 0.42, Train Acc: 85.16%, Val Loss: 0.37, Val Acc: 88.84%, Time: 0:18:10
Iter: 3800, Train Loss:  0.4, Train Acc: 85.16%, Val Loss: 0.38, Val Acc: 88.49%, Time: 0:18:40
Iter: 3900, Train Loss: 0.45, Train Acc: 87.50%, Val Loss: 0.37, Val Acc: 88.70%, Time: 0:19:10
Iter: 4000, Train Loss: 0.36, Train Acc: 88.28%, Val Loss: 0.37, Val Acc: 88.70%, Time: 0:19:39
Iter: 4100, Train Loss:  0.4, Train Acc: 85.94%, Val Loss: 0.35, Val Acc: 89.06%, Time: 0:20:08 *
Iter: 4200, Train Loss:  0.4, Train Acc: 87.50%, Val Loss: 0.37, Val Acc: 88.70%, Time: 0:20:37
Epoch [4/20]
Iter: 4300, Train Loss: 0.39, Train Acc: 86.72%, Val Loss: 0.37, Val Acc: 88.86%, Time: 0:21:06
Iter: 4400, Train Loss: 0.41, Train Acc: 89.06%, Val Loss: 0.36, Val Acc: 89.18%, Time: 0:21:36
Iter: 4500, Train Loss:  0.4, Train Acc: 88.28%, Val Loss: 0.36, Val Acc: 89.00%, Time: 0:22:04
Iter: 4600, Train Loss: 0.29, Train Acc: 90.62%, Val Loss: 0.37, Val Acc: 88.91%, Time: 0:22:34
Iter: 4700, Train Loss: 0.43, Train Acc: 89.06%, Val Loss: 0.37, Val Acc: 88.64%, Time: 0:23:02
Iter: 4800, Train Loss: 0.31, Train Acc: 87.50%, Val Loss: 0.37, Val Acc: 88.96%, Time: 0:23:31
Iter: 4900, Train Loss: 0.29, Train Acc: 89.84%, Val Loss: 0.36, Val Acc: 89.17%, Time: 0:24:00
Iter: 5000, Train Loss: 0.27, Train Acc: 90.62%, Val Loss: 0.36, Val Acc: 89.31%, Time: 0:24:30
Iter: 5100, Train Loss: 0.39, Train Acc: 89.84%, Val Loss: 0.36, Val Acc: 89.50%, Time: 0:24:58
No optimization for a long time, auto-stopping...
Test Loss: 0.34, Test Acc: 89.80%
Precision, Recall and F1-Score...
               precision    recall  f1-score   support

      finance     0.9165    0.8670    0.8911      1000
       realty     0.9016    0.9350    0.9180      1000
       stocks     0.8424    0.8390    0.8407      1000
    education     0.9401    0.9570    0.9485      1000
      science     0.8359    0.8660    0.8507      1000
      society     0.8912    0.8850    0.8881      1000
     politics     0.8784    0.8740    0.8762      1000
       sports     0.9251    0.9510    0.9379      1000
         game     0.9207    0.9050    0.9128      1000
entertainment     0.9308    0.9010    0.9157      1000

     accuracy                         0.8980     10000
    macro avg     0.8983    0.8980    0.8979     10000
 weighted avg     0.8983    0.8980    0.8979     10000

Confusion Matrix...
[[867  18  67   5   8  11  10  11   1   2]
 [ 12 935  17   3   4  12   5   4   4   4]
 [ 43  28 839   2  38   4  39   2   5   0]
 [  3   1   0 957   7   9   8   4   2   9]
 [  1   5  33   8 866  16  23   6  32  10]
 [  6  21   2  22  17 885  23   3   5  16]
 [  9  14  27   9  23  32 874   6   1   5]
 [  2   3   2   3   4  10   7 951   4  14]
 [  1   1   5   4  56   6   1  14 905   7]
 [  2  11   4   5  13   8   5  27  24 901]]
Time usage: 0:00:04

Process finished with exit code 0
```
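For reference, the architecture in the printed model summary above can be sketched roughly as follows. The hyperparameters (vocab 4762, embed 300, three Conv2d branches with kernel heights 2/3/4 and 256 filters each, dropout 0.5, 768→10 classifier) are read off the log; the class name, argument names, and forward logic are our reconstruction, not the repo's exact code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """Sketch of the TextCNN variant shown in the log's model summary."""

    def __init__(self, vocab_size=4762, embed_dim=300, num_filters=256,
                 kernel_heights=(2, 3, 4), num_classes=10, dropout=0.5):
        super().__init__()
        # padding_idx=4761 matches the Embedding(4762, 300, padding_idx=4761) line
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=vocab_size - 1)
        # Each conv spans the full embedding width, so it collapses that axis
        self.convs = nn.ModuleList(
            nn.Conv2d(1, num_filters, (h, embed_dim)) for h in kernel_heights
        )
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(num_filters * len(kernel_heights), num_classes)  # 768 -> 10

    def forward(self, x):                          # x: (batch, seq_len) token ids
        out = self.embedding(x).unsqueeze(1)       # (batch, 1, seq_len, embed_dim)
        feats = []
        for conv in self.convs:
            m = F.relu(conv(out)).squeeze(3)       # (batch, filters, seq_len - h + 1)
            feats.append(F.max_pool1d(m, m.size(2)).squeeze(2))  # max over time
        return self.fc(self.dropout(torch.cat(feats, dim=1)))   # (batch, num_classes)

model = TextCNN()
logits = model(torch.randint(0, 4762, (4, 32)))    # 4 sequences of length 32
```

Max-over-time pooling is what lets the three branch outputs concatenate to a fixed 768-dim vector regardless of sequence length.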

sloanqin commented 3 years ago

Running FastText directly, with random word vectors, accuracy is 0.9167.

```
Vocab size: 4762
180000it [00:08, 22208.38it/s]
10000it [00:00, 24515.59it/s]
10000it [00:00, 25578.53it/s]
Time usage: 0:00:09
<bound method Module.parameters of Model(
  (embedding): Embedding(4762, 300, padding_idx=4761)
  (embedding_ngram2): Embedding(50000, 300)
  (embedding_ngram3): Embedding(50000, 300)
  (dropout): Dropout(p=0.5, inplace=False)
  (fc1): Linear(in_features=900, out_features=256, bias=True)
  (fc2): Linear(in_features=256, out_features=10, bias=True)
)>
Epoch [1/20]
Iter:    0, Train Loss:  2.3, Train Acc: 10.94%, Val Loss:  2.4, Val Acc:  9.76%, Time: 0:00:01
Iter:  100, Train Loss:  1.1, Train Acc: 63.28%, Val Loss:  1.1, Val Acc: 67.67%, Time: 0:00:26
Iter:  200, Train Loss:  1.1, Train Acc: 64.06%, Val Loss: 0.78, Val Acc: 75.56%, Time: 0:00:47
Iter:  300, Train Loss: 0.79, Train Acc: 75.00%, Val Loss: 0.69, Val Acc: 77.97%, Time: 0:01:09
Iter:  400, Train Loss: 0.91, Train Acc: 70.31%, Val Loss: 0.68, Val Acc: 76.88%, Time: 0:01:30
Iter:  500, Train Loss: 0.68, Train Acc: 76.56%, Val Loss: 0.56, Val Acc: 81.80%, Time: 0:01:51
Iter:  600, Train Loss:  0.6, Train Acc: 76.56%, Val Loss: 0.55, Val Acc: 82.56%, Time: 0:02:13
Iter:  700, Train Loss: 0.72, Train Acc: 77.34%, Val Loss: 0.51, Val Acc: 83.77%, Time: 0:02:35
Iter:  800, Train Loss: 0.63, Train Acc: 78.91%, Val Loss: 0.51, Val Acc: 83.59%, Time: 0:02:56
Iter:  900, Train Loss: 0.55, Train Acc: 85.16%, Val Loss: 0.48, Val Acc: 84.88%, Time: 0:03:18
Iter: 1000, Train Loss:  0.6, Train Acc: 77.34%, Val Loss: 0.48, Val Acc: 84.87%, Time: 0:03:40
Iter: 1100, Train Loss:  0.4, Train Acc: 90.62%, Val Loss: 0.46, Val Acc: 85.72%, Time: 0:04:03
Iter: 1200, Train Loss: 0.48, Train Acc: 82.81%, Val Loss: 0.45, Val Acc: 85.73%, Time: 0:04:27
Iter: 1300, Train Loss: 0.57, Train Acc: 82.03%, Val Loss: 0.44, Val Acc: 86.06%, Time: 0:04:49
Iter: 1400, Train Loss: 0.62, Train Acc: 78.91%, Val Loss: 0.43, Val Acc: 86.95%, Time: 0:05:11
Epoch [2/20]
Iter: 1500, Train Loss: 0.55, Train Acc: 83.59%, Val Loss: 0.42, Val Acc: 86.84%, Time: 0:05:33
Iter: 1600, Train Loss: 0.42, Train Acc: 83.59%, Val Loss: 0.42, Val Acc: 86.59%, Time: 0:05:55
Iter: 1700, Train Loss: 0.46, Train Acc: 81.25%, Val Loss: 0.41, Val Acc: 87.07%, Time: 0:06:17
Iter: 1800, Train Loss: 0.38, Train Acc: 89.84%, Val Loss:  0.4, Val Acc: 87.59%, Time: 0:06:39
Iter: 1900, Train Loss: 0.53, Train Acc: 84.38%, Val Loss:  0.4, Val Acc: 87.52%, Time: 0:07:01
Iter: 2000, Train Loss: 0.45, Train Acc: 83.59%, Val Loss: 0.38, Val Acc: 88.18%, Time: 0:07:23
Iter: 2100, Train Loss: 0.44, Train Acc: 82.81%, Val Loss: 0.38, Val Acc: 88.45%, Time: 0:07:44
Iter: 2200, Train Loss: 0.28, Train Acc: 91.41%, Val Loss: 0.37, Val Acc: 88.42%, Time: 0:08:06
Iter: 2300, Train Loss: 0.34, Train Acc: 88.28%, Val Loss: 0.37, Val Acc: 88.61%, Time: 0:08:28
Iter: 2400, Train Loss: 0.48, Train Acc: 85.94%, Val Loss: 0.36, Val Acc: 88.97%, Time: 0:08:50
Iter: 2500, Train Loss:  0.3, Train Acc: 91.41%, Val Loss: 0.36, Val Acc: 89.13%, Time: 0:09:12
Iter: 2600, Train Loss:  0.4, Train Acc: 85.94%, Val Loss: 0.36, Val Acc: 88.92%, Time: 0:09:33
Iter: 2700, Train Loss: 0.34, Train Acc: 89.84%, Val Loss: 0.35, Val Acc: 89.05%, Time: 0:09:55
Iter: 2800, Train Loss: 0.48, Train Acc: 81.25%, Val Loss: 0.35, Val Acc: 89.28%, Time: 0:10:17
Epoch [3/20]
Iter: 2900, Train Loss: 0.36, Train Acc: 88.28%, Val Loss: 0.34, Val Acc: 89.68%, Time: 0:10:42
Iter: 3000, Train Loss: 0.33, Train Acc: 89.84%, Val Loss: 0.34, Val Acc: 89.50%, Time: 0:11:07
Iter: 3100, Train Loss: 0.27, Train Acc: 92.19%, Val Loss: 0.34, Val Acc: 89.34%, Time: 0:11:32
Iter: 3200, Train Loss: 0.48, Train Acc: 88.28%, Val Loss: 0.33, Val Acc: 89.55%, Time: 0:11:57
Iter: 3300, Train Loss: 0.34, Train Acc: 91.41%, Val Loss: 0.33, Val Acc: 89.59%, Time: 0:12:22
Iter: 3400, Train Loss:  0.4, Train Acc: 86.72%, Val Loss: 0.33, Val Acc: 89.81%, Time: 0:12:47
Iter: 3500, Train Loss: 0.26, Train Acc: 92.97%, Val Loss: 0.32, Val Acc: 90.35%, Time: 0:13:12
Iter: 3600, Train Loss: 0.24, Train Acc: 92.19%, Val Loss: 0.32, Val Acc: 90.28%, Time: 0:13:36
Iter: 3700, Train Loss: 0.43, Train Acc: 84.38%, Val Loss: 0.32, Val Acc: 90.36%, Time: 0:14:01
Iter: 3800, Train Loss: 0.35, Train Acc: 85.94%, Val Loss: 0.32, Val Acc: 90.04%, Time: 0:14:26
Iter: 3900, Train Loss: 0.34, Train Acc: 89.06%, Val Loss: 0.32, Val Acc: 90.18%, Time: 0:14:51
Iter: 4000, Train Loss: 0.26, Train Acc: 92.97%, Val Loss: 0.32, Val Acc: 90.06%, Time: 0:15:16
Iter: 4100, Train Loss: 0.27, Train Acc: 90.62%, Val Loss: 0.31, Val Acc: 90.31%, Time: 0:15:41
Iter: 4200, Train Loss: 0.36, Train Acc: 91.41%, Val Loss: 0.31, Val Acc: 90.33%, Time: 0:16:06
Epoch [4/20]
Iter: 4300, Train Loss: 0.21, Train Acc: 92.19%, Val Loss: 0.31, Val Acc: 90.40%, Time: 0:16:31
Iter: 4400, Train Loss: 0.18, Train Acc: 92.97%, Val Loss: 0.31, Val Acc: 90.59%, Time: 0:16:56
Iter: 4500, Train Loss: 0.38, Train Acc: 87.50%, Val Loss:  0.3, Val Acc: 90.66%, Time: 0:17:21
Iter: 4600, Train Loss: 0.21, Train Acc: 93.75%, Val Loss: 0.32, Val Acc: 90.10%, Time: 0:17:46
Iter: 4700, Train Loss: 0.37, Train Acc: 90.62%, Val Loss:  0.3, Val Acc: 90.98%, Time: 0:18:11
Iter: 4800, Train Loss: 0.14, Train Acc: 94.53%, Val Loss:  0.3, Val Acc: 91.04%, Time: 0:18:36
Iter: 4900, Train Loss:  0.2, Train Acc: 96.09%, Val Loss:  0.3, Val Acc: 90.94%, Time: 0:19:01
Iter: 5000, Train Loss: 0.25, Train Acc: 91.41%, Val Loss:  0.3, Val Acc: 91.02%, Time: 0:19:26
Iter: 5100, Train Loss: 0.22, Train Acc: 92.97%, Val Loss: 0.29, Val Acc: 91.22%, Time: 0:19:51
Iter: 5200, Train Loss: 0.35, Train Acc: 87.50%, Val Loss: 0.29, Val Acc: 91.07%, Time: 0:20:16
Iter: 5300, Train Loss: 0.21, Train Acc: 92.97%, Val Loss:  0.3, Val Acc: 90.79%, Time: 0:20:41
Iter: 5400, Train Loss: 0.42, Train Acc: 86.72%, Val Loss:  0.3, Val Acc: 90.79%, Time: 0:21:06
Iter: 5500, Train Loss: 0.24, Train Acc: 91.41%, Val Loss:  0.3, Val Acc: 91.02%, Time: 0:21:31
Iter: 5600, Train Loss: 0.19, Train Acc: 95.31%, Val Loss: 0.29, Val Acc: 90.96%, Time: 0:21:56
Epoch [5/20]
Iter: 5700, Train Loss:  0.3, Train Acc: 90.62%, Val Loss: 0.29, Val Acc: 91.10%, Time: 0:22:22
Iter: 5800, Train Loss: 0.19, Train Acc: 93.75%, Val Loss: 0.29, Val Acc: 91.08%, Time: 0:22:47
Iter: 5900, Train Loss: 0.29, Train Acc: 92.19%, Val Loss: 0.29, Val Acc: 91.36%, Time: 0:23:12
Iter: 6000, Train Loss: 0.29, Train Acc: 89.06%, Val Loss: 0.29, Val Acc: 91.27%, Time: 0:23:37
Iter: 6100, Train Loss: 0.34, Train Acc: 91.41%, Val Loss: 0.29, Val Acc: 91.29%, Time: 0:24:02
Iter: 6200, Train Loss:  0.1, Train Acc: 97.66%, Val Loss: 0.28, Val Acc: 91.49%, Time: 0:24:27
Iter: 6300, Train Loss:  0.2, Train Acc: 90.62%, Val Loss: 0.29, Val Acc: 91.37%, Time: 0:24:52
Iter: 6400, Train Loss: 0.14, Train Acc: 96.88%, Val Loss: 0.29, Val Acc: 91.26%, Time: 0:25:17
Iter: 6500, Train Loss: 0.22, Train Acc: 92.19%, Val Loss: 0.28, Val Acc: 91.41%, Time: 0:25:43
Iter: 6600, Train Loss: 0.27, Train Acc: 90.62%, Val Loss: 0.28, Val Acc: 91.38%, Time: 0:26:08
Iter: 6700, Train Loss: 0.11, Train Acc: 96.88%, Val Loss: 0.28, Val Acc: 91.15%, Time: 0:26:33
Iter: 6800, Train Loss:  0.2, Train Acc: 93.75%, Val Loss: 0.28, Val Acc: 91.37%, Time: 0:26:58
Iter: 6900, Train Loss: 0.18, Train Acc: 92.97%, Val Loss: 0.29, Val Acc: 91.40%, Time: 0:27:23
Iter: 7000, Train Loss: 0.25, Train Acc: 90.62%, Val Loss: 0.28, Val Acc: 91.41%, Time: 0:27:48
Epoch [6/20]
Iter: 7100, Train Loss: 0.14, Train Acc: 93.75%, Val Loss: 0.28, Val Acc: 91.45%, Time: 0:28:13
Iter: 7200, Train Loss: 0.23, Train Acc: 92.97%, Val Loss: 0.28, Val Acc: 91.63%, Time: 0:28:38
Iter: 7300, Train Loss: 0.15, Train Acc: 93.75%, Val Loss: 0.28, Val Acc: 91.41%, Time: 0:29:03
Iter: 7400, Train Loss: 0.22, Train Acc: 95.31%, Val Loss: 0.29, Val Acc: 91.38%, Time: 0:29:28
Iter: 7500, Train Loss: 0.084, Train Acc: 98.44%, Val Loss: 0.28, Val Acc: 91.61%, Time: 0:29:53
Iter: 7600, Train Loss: 0.079, Train Acc: 98.44%, Val Loss: 0.28, Val Acc: 91.66%, Time: 0:30:18 *
Iter: 7700, Train Loss: 0.21, Train Acc: 94.53%, Val Loss: 0.28, Val Acc: 91.69%, Time: 0:30:43
Iter: 7800, Train Loss:  0.3, Train Acc: 85.94%, Val Loss: 0.28, Val Acc: 91.39%, Time: 0:31:08
Iter: 7900, Train Loss: 0.16, Train Acc: 96.09%, Val Loss: 0.28, Val Acc: 91.56%, Time: 0:31:33
Iter: 8000, Train Loss: 0.19, Train Acc: 94.53%, Val Loss: 0.29, Val Acc: 91.58%, Time: 0:31:58
Iter: 8100, Train Loss: 0.15, Train Acc: 92.97%, Val Loss: 0.28, Val Acc: 91.28%, Time: 0:32:23
Iter: 8200, Train Loss: 0.23, Train Acc: 93.75%, Val Loss: 0.28, Val Acc: 91.35%, Time: 0:32:48
Iter: 8300, Train Loss: 0.12, Train Acc: 95.31%, Val Loss: 0.28, Val Acc: 91.46%, Time: 0:33:13
Iter: 8400, Train Loss: 0.38, Train Acc: 88.28%, Val Loss: 0.28, Val Acc: 91.51%, Time: 0:33:38
Epoch [7/20]
Iter: 8500, Train Loss: 0.23, Train Acc: 92.97%, Val Loss: 0.28, Val Acc: 91.69%, Time: 0:34:03
Iter: 8600, Train Loss: 0.13, Train Acc: 96.88%, Val Loss: 0.29, Val Acc: 91.64%, Time: 0:34:28
No optimization for a long time, auto-stopping...
Test Loss: 0.26, Test Acc: 91.67%
Precision, Recall and F1-Score...
               precision    recall  f1-score   support

      finance     0.9317    0.8860    0.9083      1000
       realty     0.9285    0.9350    0.9317      1000
       stocks     0.8365    0.8800    0.8577      1000
    education     0.9631    0.9390    0.9509      1000
      science     0.8766    0.8740    0.8753      1000
      society     0.8842    0.9160    0.8998      1000
     politics     0.9187    0.8820    0.9000      1000
       sports     0.9818    0.9690    0.9753      1000
         game     0.9571    0.9380    0.9475      1000
entertainment     0.8986    0.9480    0.9226      1000

     accuracy                         0.9167     10000
    macro avg     0.9177    0.9167    0.9169     10000
 weighted avg     0.9177    0.9167    0.9169     10000

Confusion Matrix...
[[886   6  67   5   9  11   9   1   1   5]
 [  8 935  20   0   5  14   2   2   1  13]
 [ 38  25 880   2  27   3  20   0   2   3]
 [  1   2   4 939   6  20   7   2   2  17]
 [  3  11  33   4 874  14  16   0  26  19]
 [  2  16   3  11  14 916  16   0   4  18]
 [  8   6  28   7  18  38 882   1   2  10]
 [  1   1   3   2   1   4   3 969   0  16]
 [  1   1   9   1  35   5   3   1 938   6]
 [  3   4   5   4   8  11   2  11   4 948]]
Time usage: 0:00:01

Process finished with exit code 0
```
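The FastText-style model in this second log averages word, bigram-hash, and trigram-hash embeddings and feeds the 900-dim concatenation through a small MLP. Below is a rough sketch reconstructed from the model summary (300 × 3 = 900 → 256 → 10); the n-gram hashing step is omitted, so inputs are assumed to be precomputed id tensors, and all names are ours rather than the repo's exact code.

```python
import torch
import torch.nn as nn

class FastText(nn.Module):
    """Sketch of the FastText variant shown in the log's model summary."""

    def __init__(self, vocab_size=4762, ngram_buckets=50000, embed_dim=300,
                 hidden=256, num_classes=10, dropout=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=vocab_size - 1)
        # Bigrams/trigrams are hashed into 50000 buckets before lookup
        self.embedding_ngram2 = nn.Embedding(ngram_buckets, embed_dim)
        self.embedding_ngram3 = nn.Embedding(ngram_buckets, embed_dim)
        self.dropout = nn.Dropout(dropout)
        self.fc1 = nn.Linear(embed_dim * 3, hidden)   # 900 -> 256
        self.fc2 = nn.Linear(hidden, num_classes)     # 256 -> 10

    def forward(self, words, bigrams, trigrams):      # each: (batch, seq_len) ids
        # Average each embedding sequence over time, then concatenate
        out = torch.cat([
            self.embedding(words).mean(dim=1),
            self.embedding_ngram2(bigrams).mean(dim=1),
            self.embedding_ngram3(trigrams).mean(dim=1),
        ], dim=1)                                     # (batch, 900)
        return self.fc2(torch.relu(self.fc1(self.dropout(out))))

model = FastText()
ids = lambda high: torch.randint(0, high, (4, 32))    # dummy id batches
logits = model(ids(4762), ids(50000), ids(50000))
```

The hashed n-gram embeddings are what give this bag-of-embeddings model word-order sensitivity, which is one plausible reason it edges out TextCNN here.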

Mingrg commented 3 years ago

Hello, which works better in your experience: randomly initialized word vectors, or pretrained ones?
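For anyone comparing the two setups, the difference is only in how `nn.Embedding` is initialized. A minimal sketch: the `pretrained` matrix below is a random placeholder standing in for a real pretrained vector table (its shape must match `(vocab_size, embed_dim)`).

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 4762, 300

# (a) random initialization, as in the runs above
rand_emb = nn.Embedding(vocab_size, embed_dim, padding_idx=vocab_size - 1)

# (b) pretrained initialization; freeze=False keeps the vectors trainable
pretrained = torch.randn(vocab_size, embed_dim)   # placeholder for real vectors
pre_emb = nn.Embedding.from_pretrained(pretrained, freeze=False,
                                       padding_idx=vocab_size - 1)
```

With a small vocabulary like 4762 tokens and 180k training examples, randomly initialized vectors can train to competitive quality, which may be why the gap is modest here; pretrained vectors tend to help more when training data is scarce.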

ohhhhhhhhhhhhhhhh commented 1 year ago

Hi, I had previously modified part of the code and found accuracy was worse than expected, so I ran the author's source code unmodified, using TextCNN with all other settings untouched. The final test accuracy was 90.36, which still falls short of expectations.
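Gaps of a point or so between reruns (0.898 / 0.9036 vs. the reported 0.91) can also come from run-to-run variance when seeds or cuDNN determinism differ between environments. A hedged sketch of the usual PyTorch seeding boilerplate, independent of the repo's own code; even with this, data-loading order and hardware can still shift the final number slightly:

```python
import random
import torch

def set_seed(seed: int = 1) -> None:
    """Seed Python and PyTorch RNGs and force deterministic cuDNN kernels."""
    random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)          # no-op without a GPU
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

# Same seed -> identical random draws, so reruns start from the same weights
set_seed(1)
a = torch.rand(3)
set_seed(1)
b = torch.rand(3)
```

Worth comparing seeds, PyTorch versions, and batch size against the author's setup before concluding the model itself underperforms.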