srvk / eesen

The official repository of the Eesen project
http://arxiv.org/abs/1507.08240
Apache License 2.0

decoding without the language model #167

Open HariKrishna-Vydana opened 6 years ago

HariKrishna-Vydana commented 6 years ago

Is there a way to decode without considering the influence of the language model?

fmetze commented 6 years ago

Greedy decoding? You can simply search for the sequence of peaks in the NN output. Or you could create a fake LM in ARPA file format that has unity transition probabilities for all words, like a grammar?
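
A minimal sketch of that greedy (best-path) approach, assuming the NN output is a `(frames, tokens)` matrix of CTC posteriors with the blank symbol at index 0 (the actual blank index and token layout in Eesen may differ): take the argmax per frame, collapse repeated labels, and drop blanks.

```python
import numpy as np

def greedy_ctc_decode(posteriors, blank_id=0):
    """Best-path CTC decoding: argmax per frame, collapse repeats, drop blanks.

    posteriors: (num_frames, num_tokens) array of NN output probabilities.
    blank_id:   index of the CTC blank symbol (assumed to be 0 here).
    """
    best_path = np.argmax(posteriors, axis=1)   # peak token per frame
    tokens = []
    prev = blank_id
    for t in best_path:
        if t != prev and t != blank_id:         # collapse repeats, skip blanks
            tokens.append(int(t))
        prev = t
    return tokens
```

The resulting token IDs can then be mapped back to characters or phonemes with the token table used during training.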

ramonsanabria commented 6 years ago

In:

https://github.com/srvk/eesen/blob/tf_clean/tf/ctc-am/test.py

you have --compute_ter, which can give you the token error rate (without a language model).
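
For reference, the token error rate is simply the edit (Levenshtein) distance between the decoded token sequence and the reference, normalized by the reference length. A minimal sketch of that metric (independent of how test.py actually computes it):

```python
def token_error_rate(hyp, ref):
    """Levenshtein distance between token sequences, normalized by reference length."""
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```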