Tian14267 opened this issue 2 years ago
Did you convert to ONNX successfully?
@Tian14267 I have the same problems as you. Besides, when I feed inputs of the same length but with different text tokens, the ONNX Runtime inference result is worse, because the mel output length seems tied to the input data used when tracing the ONNX model. This problem has confused me for several weeks; can anyone give me some ideas? Thanks!
@Tian14267 Have you found any way to fix it? I have the same problem as you.
Because the duration predictor's output differs for different tokens even when the input length is the same, converting FastSpeech2 to a single ONNX model runs into the problems above. To fix this, I split FastSpeech2 into three submodels when converting to ONNX and moved the length_regulator inference out of ONNX Runtime, then built an inference pipeline over the submodels with ONNX Runtime. Hope it is useful for you!
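A minimal sketch of what such a pipeline can look like with onnxruntime. The model file names, tensor names, and the log-duration convention below are assumptions, not the exact code from my project; adapt them to your own export:

```python
import numpy as np
import onnxruntime as ort

# Hypothetical file and tensor names; adjust to your own exported submodels.
encoder = ort.InferenceSession("fs2_encoder.onnx")
duration_predictor = ort.InferenceSession("fs2_duration_predictor.onnx")
decoder = ort.InferenceSession("fs2_decoder.onnx")

def length_regulate(hidden, durations):
    # Repeat each phoneme-level frame by its predicted duration.
    # Done in plain NumPy, so the data-dependent expansion is never
    # traced into an ONNX graph.
    reps = np.clip(np.round(durations), 1, None).astype(np.int64)
    return np.repeat(hidden, reps, axis=1)

def synthesize(tokens):
    # tokens: int64 array of shape (1, src_len)
    hidden, = encoder.run(None, {"tokens": tokens})
    log_dur, = duration_predictor.run(None, {"hidden": hidden})
    durations = np.exp(log_dur[0]) - 1.0  # assuming log-scale durations
    expanded = length_regulate(hidden, durations)
    mel, = decoder.run(None, {"hidden": expanded})
    return mel
```

Because the variable-length expansion happens between the ONNX sessions, each submodel sees only shapes it can handle dynamically.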
How can I get dynamic input when converting a torch model to an ONNX model? I export with dynamic_axes, but the input at inference time is not dynamic.

My code:

In the code I use `src_lens=10`, and the export works. But when running inference with this ONNX model and an input with `src_lens=50` (or any other length), I get an error: it seems the input length must be 10 and can't be dynamic. Can somebody help me?
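For reference, a minimal export sketch showing `dynamic_axes` usage; the module and names below are placeholders, not your actual model. Note that `dynamic_axes` only marks shapes as symbolic; any Python-side logic that depends on concrete tensor sizes (like FastSpeech2's length regulator) still gets baked in during tracing, which is why the split described above is needed:

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    # Stand-in module so the example is self-contained; use your real model.
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(100, 32)
        self.proj = nn.Linear(32, 32)

    def forward(self, tokens):
        return self.proj(self.embed(tokens))

model = TinyEncoder().eval()
dummy_tokens = torch.zeros(1, 10, dtype=torch.long)  # src_lens=10 at trace time

torch.onnx.export(
    model,
    (dummy_tokens,),
    "encoder.onnx",
    input_names=["tokens"],
    output_names=["hidden"],
    # Keys here must exactly match input_names/output_names, and the
    # axis indices must be the ones that vary; otherwise the exported
    # graph silently keeps the fixed trace-time shape (e.g. length 10).
    dynamic_axes={
        "tokens": {0: "batch", 1: "src_len"},
        "hidden": {0: "batch", 1: "src_len"},
    },
    opset_version=13,
)
```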