qhduan / just_another_seq2seq
Just another seq2seq repo
329 stars · 97 forks

Issues (newest first)
Hello, could you send me a copy of the trained model? I have a similar project on a tight deadline and no time left to train. 2238478733@qq.com Thanks (#32, zheroic, opened 4 years ago, 1 comment)
Can training still work if the corpus has fewer than a million lines? (#31, RaymondJSu, opened 5 years ago, 0 comments)
ValueError: Sample larger than population or is negative (#30, niliusha123, closed 5 years ago, 0 comments)
When training on other data, the number of steps per epoch drops too much (#29, niliusha123, closed 5 years ago, 1 comment)
Hello, the file behind the pretrained embedding link in the README is gone; could you upload it again? (#28, Reuben-Yang, opened 5 years ago, 1 comment)
pretrained_embedding (#27, ECNUHP, opened 5 years ago, 1 comment)
Error when starting the model (#26, ECNUHP, opened 5 years ago, 0 comments)
Thread error (#25, NexusLee, opened 5 years ago, 0 comments)
Thread problem (#24, charlesXu86, closed 5 years ago, 0 comments)
antiLM should only be used at testing time, not during training, right? (#23, Leputa, opened 5 years ago, 1 comment)
Is there an explanation of how the seq2seq bot is implemented? (#22, ily666666, opened 5 years ago, 2 comments)
How should the parameters be tuned to achieve the best results? (#21, ily666666, opened 5 years ago, 3 comments)
Update README.md (#20, lvzhetx, closed 6 years ago, 0 comments)
Test (#19, qhduan, closed 6 years ago, 0 comments)
Some of the data in the corpus is garbled (#18, lvzhetx, closed 5 years ago, 0 comments)
Beam search generates the same sentence (#17, weiwancheng, opened 6 years ago, 0 comments)
Question about seq2seq-ner performance (#16, qichaotang, closed 6 years ago, 2 comments)
On the implementation of multi-layer bidirectional RNNs (#15, shuaihuaiyi, opened 6 years ago, 0 comments)
How to train on a corpus I collected myself (#14, ZNZHL, opened 6 years ago, 4 comments)
The project is being exploited by a training institute (#13, GaoQ1, closed 6 years ago, 5 comments)
The trained model only produces two kinds of results at prediction time (#12, charles0-0, opened 6 years ago, 5 comments)
Loss sometimes jumps to 200 (#11, yzho0907, opened 6 years ago, 2 comments)
Does pre-trained word embedding help? (#10, yzho0907, closed 6 years ago, 1 comment)
ner/train_crf_loss.py Error (#9, 666XD, closed 3 years ago, 0 comments)
How long does training take? (#8, yzho0907, closed 6 years ago, 2 comments)
Reduced some training epochs and re-verified that the chatbot still works (#7, qhduan, closed 6 years ago, 0 comments)
After training, the test output is all punctuation marks... (#6, shizhediao, opened 6 years ago, 10 comments)
Running python3 extract_tmx.py runs out of memory (#5, Kiteflyingee, opened 6 years ago, 1 comment)
Why is training so slow? (#4, kingdeewang, opened 6 years ago, 8 comments)
Test (#3, qhduan, closed 6 years ago, 0 comments)
After training, the test output is garbled (#2, fire717, opened 6 years ago, 4 comments)
just_another_seq2seq/chatbot/train.py (#1, chinalu, closed 6 years ago, 1 comment)