wenyu332 / daguan_bert_ner
30 stars · 13 forks
Issues
Sorted by: Newest
#24: Hi, the Baidu Cloud share link for the pretrained model's dataset has expired. Could you please post a new one? Thanks.
tiancai134007 · opened 2 years ago · 2 comments
#23: pred_num = 0?
hrshy0629 · closed 3 years ago · 1 comment
#22: Hi, do you know how to modify the BERT code for multi-label sequence labeling?
jieyang123 · opened 4 years ago · 0 comments
#21: About attention
zxbnmop · closed 5 years ago · 1 comment
#20: The Baidu Cloud share link for the pretrained model has expired. Could you please share a new one? Thanks.
zxy951005 · opened 5 years ago · 1 comment
#19: Training beyond specified 't_total'. Learning rate multiplier set to 0.0. Please set 't_total' of WarmupLinearSchedule correctly
ghost · closed 5 years ago · 1 comment
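The warning in issues #19 and #12 comes from pytorch_pretrained_bert's BertAdam: its WarmupLinearSchedule decays the learning rate to 0 at step `t_total`, so if training runs for more optimizer steps than `t_total`, the LR multiplier becomes 0.0 and the warning fires. A minimal sketch of computing `t_total` so the schedule covers the whole run (the function name and the concrete numbers are illustrative assumptions, not taken from this repo):

```python
import math

def compute_t_total(num_examples, batch_size, grad_accum_steps, num_epochs):
    """Total number of optimizer updates over the whole training run.

    BertAdam's WarmupLinearSchedule decays the LR linearly to 0 at
    t_total; stepping past it triggers the "Training beyond specified
    't_total'" warning with a 0.0 LR multiplier.
    """
    # One optimizer step happens per (batch_size * grad_accum_steps) examples.
    steps_per_epoch = math.ceil(num_examples / (batch_size * grad_accum_steps))
    return steps_per_epoch * num_epochs

# Example: 17000 training sentences, batch size 32, no gradient
# accumulation, 10 epochs (all illustrative numbers).
t_total = compute_t_total(17000, 32, 1, 10)

# This value would then be passed when constructing the optimizer, e.g.
# BertAdam(optimizer_grouped_parameters, lr=5e-5, warmup=0.1, t_total=t_total)
```

If the number of epochs or the batch size is changed later, `t_total` must be recomputed accordingly, or the schedule will again end before training does.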
#18: AssertionError
neverstoplearn · opened 5 years ago · 1 comment
#17: Hello, a question about the BERT pretrained model
lx86110 · closed 5 years ago · 5 comments
#16: After changing max_seq_len, what else needs to be changed?
dh12306 · opened 5 years ago · 5 comments
#15: In main(), during the train() stage, print(optimizer.get_lr) outputs [0]. Does this mean the BertAdam optimizer's learning rate is 0?
bytekongfrombupt · opened 5 years ago · 2 comments
#14: Hi wenyu332, is the regularization adjusted in the BertAdam class?
lizhzh8 · opened 5 years ago · 2 comments
#13: Error while training BERT, asking for advice
luluforever · opened 5 years ago · 7 comments
#12: Training beyond specified 't_total'. Learning rate multiplier set to 0.0. Please set 't_total' of WarmupLinearSchedule correctly
dh12306 · closed 5 years ago · 1 comment
#11: Is your team name scofyyy?
yangcontrol · closed 5 years ago · 3 comments
#10: What is 1.41_150000_step?
neverstoplearn · closed 5 years ago · 0 comments
#9: Hi wenyu332. Training with your program, I changed the number of training iterations to 10 and got the predict.txt file. But when generating submit.txt with the submit script, it gets stuck after a certain number of entries.
lizhzh8 · closed 5 years ago · 4 comments
#8: Hi, what is the pytorch_pretrained_bert file in the code? It seems to be missing, and the code won't run without it.
jieyang123 · closed 5 years ago · 2 comments
#7: In what ways can this model be tuned?
yangcontrol · opened 5 years ago · 9 comments
#6: From the code, wenyu332, you pretrain BERT with Torch. Is there a reference for pretraining BERT with TensorFlow, or a way to convert bert_weight.bin to bert_model.ckpt?
lizhzh8 · opened 5 years ago · 3 comments
#5: How much better is the BERT model than word2vec?
yangcontrol · opened 5 years ago · 1 comment
#4: Some files seem to be missing
2hip3ng · closed 5 years ago · 2 comments
#3: A question about training the BERT model
luluforever · opened 5 years ago · 3 comments
#2: The code doesn't seem to run
yangcontrol · opened 5 years ago · 3 comments
#1: Are some files missing?
yangcontrol · closed 5 years ago · 0 comments