I am using OpenSeq2Seq for an ASR project. I changed tf.nn.ctc_loss to warp-CTC in the code, setting the blank_label parameter to the size of the vocabulary. After that, training runs without errors, but the training loss does not converge and the predictions are very strange:
[1,0]: Sample target: here is also seen an american indian a cowboy a merchant and an artisan an american flag is borne aloft while four west point cadets suggest training and leadership women relief workers of all kinds are seen
[1,0]: Sample prediction: e e e e e e te e e e e te e e e te e e e e e
When I use tf.nn.ctc_loss, the prediction looks as it should (not so many blanks and repeated e's).
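For context on why a wrong blank index can produce output like the above: CTC greedy decoding takes the argmax class at each frame, collapses consecutive repeats, and drops blanks, so if the loss was trained against the wrong blank index the decoder keeps emitting one real symbol over and over. A minimal sketch of greedy decoding (numpy only, not OpenSeq2Seq's actual decoder):

```python
import numpy as np

def ctc_greedy_decode(logits, blank_index):
    """Greedy CTC decode: argmax per frame, collapse repeats, drop blanks."""
    best = np.argmax(logits, axis=-1)          # (time,) best class per frame
    decoded = []
    prev = None
    for idx in best:
        # keep a frame only if it differs from the previous frame
        # and is not the blank symbol
        if idx != prev and idx != blank_index:
            decoded.append(int(idx))
        prev = idx
    return decoded

# Toy path 1,1,<blank>,3→... with 4 classes and blank at the last index (3):
path_logits = np.eye(4)[[1, 1, 3, 2, 2]]       # argmax path: 1,1,3,2,2
print(ctc_greedy_decode(path_logits, blank_index=3))   # [1, 2]
# Decoding the same logits with the WRONG blank index keeps the blank frames
# as if they were a real symbol:
print(ctc_greedy_decode(path_logits, blank_index=0))   # [1, 3, 2]
```

The second call illustrates the failure mode: with a mismatched blank index, frames the network intended as blank are decoded as a real character.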
Here is the code I changed.

How can I fix this problem?
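One thing worth double-checking: the classic tf.nn.ctc_loss reserves the last class (num_classes - 1) for the blank, while the original warp-CTC core assumes the blank at index 0, so when passing blank_label = vocabulary size the network must emit vocab_size + 1 logits and the training labels must never use that index. A small sanity check along those lines (all names and sizes are hypothetical, not OpenSeq2Seq's actual code):

```python
import numpy as np

# Hypothetical sizes for illustration only
VOCAB_SIZE = 28                  # real symbols use indices 0 .. 27
BLANK_LABEL = VOCAB_SIZE         # blank placed after all real symbols
NUM_CLASSES = VOCAB_SIZE + 1     # so the model must emit vocab_size + 1 logits

def validate_ctc_inputs(logits, labels, blank_label):
    """Sanity-check the blank convention: the blank index must exist in the
    logits' class dimension, and no training label may use it."""
    num_classes = logits.shape[-1]
    if not 0 <= blank_label < num_classes:
        raise ValueError("blank_label %d is outside the %d logit classes"
                         % (blank_label, num_classes))
    if any(int(l) == blank_label for l in labels):
        raise ValueError("labels must never contain the blank index")
    return True

# Passes: 29 logit classes, labels stay below the blank index
validate_ctc_inputs(np.zeros((100, NUM_CLASSES)), [5, 1, 17], BLANK_LABEL)
```

If the model still emits only vocab_size logits while blank_label is set to vocab_size, the loss is silently computed against a class the network never produces, which would match the non-converging behavior described above.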