huawei-noah / Pretrained-Language-Model

Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.

TinyBERT performs very poorly on the Chinese dataset iflytek #167

Open xdnjust opened 2 years ago

xdnjust commented 2 years ago

Hi, I ran TinyBERT on the iflytek dataset and the results are very poor (see below). Could you suggest what might be causing this? Training configuration: no data augmentation, max_seq_length=128, train_batch_size=32, learning_rate=5e-5.
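For context, here is a minimal sketch of how such a run might be launched with the repository's task_distill.py script, using the flags documented in the TinyBERT README. The checkpoint paths, the number of epochs, and the registration of "iflytek" as a task name (iflytek is a CLUE task with 119 classes, so it needs a custom data processor) are assumptions on my part, not details from the original report.

```python
# Sketch of the assumed task-specific (prediction-layer) distillation command.
# All paths and the "iflytek" task registration are hypothetical.
import subprocess

cmd = [
    "python", "task_distill.py",
    "--pred_distill",                                    # prediction-layer distillation stage
    "--teacher_model", "fine_tuned_bert_base_iflytek",   # hypothetical teacher checkpoint
    "--student_model", "general_tinybert_zh",            # hypothetical general TinyBERT checkpoint
    "--data_dir", "data/iflytek",                        # hypothetical data directory
    "--task_name", "iflytek",                            # requires a processor for 119 labels
    "--output_dir", "tinybert_iflytek",
    "--max_seq_length", "128",                           # values from the report above
    "--train_batch_size", "32",
    "--learning_rate", "5e-5",
    "--num_train_epochs", "3",                           # assumed; not stated in the report
    "--do_lower_case",
]
subprocess.run(cmd, check=True)
```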

02/10 07:18:14 PM Eval results
02/10 07:18:14 PM acc = 0.0011547344110854503
02/10 07:18:14 PM acc_and_f1 = 0.0005870596638409233
02/10 07:18:14 PM att_loss = 0.0
02/10 07:18:14 PM cls_loss = 0.039709030740684076
02/10 07:18:14 PM eval_loss = 4.794164785524694
02/10 07:18:14 PM f1 = 1.9384916596396344e-05
02/10 07:18:14 PM global_step = 1199
02/10 07:18:14 PM loss = 0.039709030740684076
02/10 07:18:14 PM rep_loss = 0.0
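As a rough sanity check (my addition, assuming iflytek's 119 classes as defined in CLUE), the reported accuracy of 0.00115 is well below even the uniform-random baseline of 1/119 ≈ 0.0084, and the F1 is orders of magnitude smaller still. That pattern usually points to an inconsistent label mapping between training and evaluation, or to the classifier collapsing onto a single label, rather than to the hyperparameters themselves:

```python
# Back-of-the-envelope comparison of the reported metrics against a random baseline.
# The class count is an assumption based on the public iflytek/CLUE task definition.
num_labels = 119                              # iflytek has 119 application categories
reported_acc = 0.0011547344110854503          # value from the eval log above
reported_f1 = 1.9384916596396344e-05

random_baseline = 1.0 / num_labels            # uniform guessing ~= 0.0084
print(f"reported acc    : {reported_acc:.4f}")
print(f"reported f1     : {reported_f1:.6f}")
print(f"random baseline : {random_baseline:.4f}")

# Accuracy below random guessing plus a near-zero F1 is the typical signature of
# predictions concentrating on one (rare) label or of mismatched label ids.
assert reported_acc < random_baseline
```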

pong991 commented 2 years ago

Hi, I'm also getting poor results on the task I'm working on. Have you found the cause?