jiangxinyang227 / textClassifier

tensorflow implementation

Why does BERT with only its original MLP head reach 86% accuracy, but drop to 65% after adding BiLSTM+Attention? #21

Closed SmartMapple closed 3 years ago

SmartMapple commented 3 years ago

With BiLSTM+Attention added:

INFO:tensorflow: Eval results
INFO:tensorflow: eval_accuracy = 0.6544118
INFO:tensorflow: eval_auc = 0.52433944
INFO:tensorflow: eval_precision = 0.69602275
INFO:tensorflow: eval_recall = 0.8781362
INFO:tensorflow: global_step = 917
INFO:tensorflow: loss = 0.77603155

The baseline was BERT's original MLP head with the official hyperparameters, except dropout keep probability set to 0.5.
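For context, the BiLSTM+Attention head discussed here typically applies additive attention pooling over the BiLSTM's per-timestep outputs before the classifier. Below is a minimal NumPy sketch of that pooling step; the function and variable names (`attention_pool`, `w`, `b`, `u`) are illustrative, not the repo's actual code:

```python
import numpy as np

def attention_pool(seq_out, w, b, u):
    """Additive attention pooling over a sequence.

    seq_out: [T, D] per-timestep outputs (e.g. concatenated BiLSTM states)
    w, b, u: attention parameters (hypothetical names)
    Returns a single [D] vector: the attention-weighted sum over time.
    """
    scores = np.tanh(seq_out @ w + b) @ u      # [T] unnormalized scores
    scores = scores - scores.max()             # numerical stability
    alphas = np.exp(scores) / np.exp(scores).sum()  # softmax over timesteps
    return alphas @ seq_out                    # [D] weighted sum

rng = np.random.default_rng(0)
T, D = 8, 16                                   # 8 timesteps, 16-dim states
seq = rng.normal(size=(T, D))
w = rng.normal(size=(D, D))
b = np.zeros(D)
u = rng.normal(size=D)
pooled = attention_pool(seq, w, b, u)
print(pooled.shape)                            # (16,)
```

A common failure mode when bolting such a head onto BERT is training the new randomly initialized parameters with a learning rate tuned for fine-tuning alone, which can explain a large accuracy drop like the one reported above.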

SmartMapple commented 3 years ago

It was a bug in my own code; it works well now.