Closed asterixvn closed 4 years ago
The project is not updated for the latest versions of Kaldi. Your problem is likely caused by the initialization or decoder code, and it would require code changes to work.
Thanks a lot Ilya,
Do you intend to update the code? Or do you have any suggestion about which part of the code I should update? It would save me a lot of time.
Or should I convert my model from the new version to the old version? Thanks.
Best, Asterix
You should look into the decoder code. For the most part, these are old 2017 Kaldi decoders backported here, and they likely do not support the new features you need. You either need to backport newer code into the existing decoder or create a new one.
I cannot help you beyond that.
OK, thanks for your suggestion!
Hi all,
First of all, thanks to the authors for this great program. We can now make an easy demo with Kaldi models. That's really great!
I tested the api.ai models (no ivector) and the multi_cn_chain_sp_online models (with ivector), both downloaded from the Kaldi models website. Both seem to work well.
However, when I tested a TDNN-F model with pitch and ivector, it did not work. The program always outputs wrong text.
For info, I already used --add-pitch=true and --online-pitch-config=conf/online_pitch.conf (online_pitch.conf contains --sample-frequency=16000).
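For reference, this is roughly what my pitch config looks like. Only --sample-frequency=16000 is confirmed from my setup; the other lines are standard Kaldi pitch-extraction options shown here only as an illustrative sketch:

```
# conf/online_pitch.conf (sketch; only --sample-frequency is from my actual file)
--sample-frequency=16000
--min-f0=50
--max-f0=400
```

The point is that the sample frequency here must match the audio and the MFCC config, otherwise the pitch features will be inconsistent with training.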
I have no problem with online2-tcp-nnet3-decode-faster:

    online2-tcp-nnet3-decode-faster \
      --samp-freq=16000 \
      --frames-per-chunk=20 \
      --extra-left-context-initial=0 \
      --frame-subsampling-factor=3 \
      --feature-type=mfcc \
      --mfcc-config=conf/mfcc.conf \
      --ivector-extraction-config=conf/ivector_extractor.conf \
      --add-pitch=true \
      --online-pitch-config=conf/online_pitch.conf \
      --endpoint.silence-phones=1:2:3:4:5:6:7:8:9:10:11:12:13:14:15 \
      --min-active=200 \
      --max-active=7000 \
      --beam=15.0 \
      --lattice-beam=6.0 \
      --acoustic-scale=1.0 \
      --port-num=5050 \
      final.mdl HCLG.fst words.txt
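In case it helps anyone compare the two decoders, here is a small client sketch I use to feed audio to the TCP server above. It assumes the server reads raw 16 kHz, 16-bit little-endian mono PCM on the port and writes recognized text back; the host, port, chunk size, and helper names are my own choices, not part of Kaldi:

```python
# Minimal test client sketch for a TCP decoder server like the one above.
# Assumption: the server consumes raw S16LE 16 kHz mono PCM and replies with text.
import socket
import struct

def floats_to_s16le(samples):
    """Convert float samples in [-1, 1] to raw signed 16-bit little-endian bytes."""
    clipped = [max(-1.0, min(1.0, s)) for s in samples]
    return struct.pack("<%dh" % len(clipped), *(int(s * 32767) for s in clipped))

def stream_pcm(pcm_bytes, host="localhost", port=5050, chunk=3200):
    """Send raw PCM to the decoder in small chunks and collect its text output."""
    with socket.create_connection((host, port)) as sock:
        for i in range(0, len(pcm_bytes), chunk):
            sock.sendall(pcm_bytes[i:i + chunk])
        sock.shutdown(socket.SHUT_WR)  # signal end of audio to the server
        out = b""
        while True:
            data = sock.recv(4096)
            if not data:
                break
            out += data
    return out.decode("utf-8", errors="replace")
```

Usage is just `stream_pcm(open("audio_16k_mono.raw", "rb").read())` against a running server; feeding the same audio to both decoders is how I compared their outputs.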
Is this an incompatibility between Kaldi versions? Why does online2-tcp-nnet3-decode-faster work well? Does anyone have the same problem or a solution? Thanks a lot.
Best regards, Asterix