Closed: willixen closed this issue 5 years ago.
Likely a graph/model mismatch. By design, nnet decoders now segfault when you use the wrong graph for your model; this avoids a time-consuming extra check in the decoder inner loops.
Oh, got it. Thanks Dan.
But here is a follow-up question: I don't know why my model and graph are mismatched. Can't I use the graph generated with the tri5 model when decoding with the babel_multi DNN model? The default recipe seems to decode this way.
The related code:
utils/mkgraph.sh data/$lang/lang_test exp/$lang/tri5 exp/$lang/tri5/graph | tee exp/$lang/tri5/mkgraph.log
I just changed data/$lang/lang to data/$lang/lang_test to make sure there is a G.fst.
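If the multilingual DNN system was built with its own tree rather than reusing tri5's, the graph would need to be built from that system's directory instead. A sketch, assuming a standard Kaldi experiment layout; `exp/$lang/multi_dnn` is a hypothetical directory name, substitute the directory of the model you actually decode with:

```shell
# Hypothetical nnet directory name; substitute your actual one.
# mkgraph.sh must be pointed at the directory of the model used for
# decoding, so that HCLG.fst is built with that model's tree and
# transition model.
utils/mkgraph.sh data/$lang/lang_test exp/$lang/multi_dnn exp/$lang/multi_dnn/graph \
  | tee exp/$lang/multi_dnn/mkgraph.log
```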
@danpovey
I don't recall the details of the babel recipe, but you should check what tree the system you are decoding was built with; if it was not the tri5 tree, you would have a problem.
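One quick way to do that check, sketched below: each Kaldi experiment directory contains a `tree` file recording the tree the system was built with, so comparing the graph-source system's tree against the decoding system's tree byte-for-byte tells you whether they match. The directory names here are hypothetical; adjust them to your setup:

```shell
# Hypothetical paths; adjust to your experiment directories.
# The tree files are binary, so a byte-level comparison is sufficient
# to detect a graph/model mismatch.
if cmp -s exp/$lang/tri5/tree exp/$lang/multi_dnn/tree; then
  echo "trees match: the tri5 graph should be compatible"
else
  echo "trees differ (or a file is missing): rebuild the graph from the decoding model's directory"
fi
```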
On Sun, Mar 10, 2019 at 12:15 AM willixen notifications@github.com wrote:
Dear all and @danpovey, I want to decode with a multitask model trained with the babel_multi recipe on my own multilingual data, but I get a "Segmentation fault" (core dumped) and cannot find any other error info. I don't think the utterance is too long, since it is the same as the training data, and this test data works fine with a DNN model. Help me, please.
The log of the decode recipe:
The log of decode_dev/log/decode.2.log:
Any help will be appreciated!
Best,
willix