serg06 opened this issue 3 years ago
You can use the ONNX models in a way similar to what is done in the `test_inference` function. Although that function operates on PyTorch models, the logic is the same: pass the output of the encoder, along with the decoder inputs, to a loop that executes `decoder_iter` for as long as the `not_finished` flag is set. The output from the last execution of `decoder_iter` is then passed to `postnet`.

Check the PyTorch tutorial for how to do inference with ONNX Runtime.
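A control-flow sketch of the three-stage loop described above. The `run_*` functions are toy stand-ins for the three `onnxruntime.InferenceSession` objects; in real use each call would be `session.run(None, {input_name: array, ...})` against the exported files. All function names, tensor names, and shapes here are hypothetical, not from the export script.

```python
import numpy as np

def run_encoder(sequences):
    # encoder stage: token ids -> memory of shape (batch, T, hidden); toy values
    return np.zeros((sequences.shape[0], sequences.shape[1], 8), np.float32)

def run_decoder_iter(memory, state, step):
    # decoder_iter stage: one decoder step -> mel frame, gate value, new state.
    # This toy gate "fires" (goes positive) after 5 steps.
    mel_frame = np.zeros((memory.shape[0], 80), np.float32)
    gate = 1.0 if step >= 5 else -1.0  # gate > 0 means "stop"
    return mel_frame, gate, state

def run_postnet(mel):
    # postnet stage: refine the assembled mel spectrogram (identity here)
    return mel

def infer(sequences):
    memory = run_encoder(sequences)
    state, mels, step = None, [], 0
    not_finished = True
    while not_finished:  # keep calling decoder_iter while not_finished is set
        mel_frame, gate, state = run_decoder_iter(memory, state, step)
        mels.append(mel_frame)
        step += 1
        not_finished = gate <= 0.0
    mel = np.stack(mels, axis=1)  # (batch, decoder_steps, 80)
    return run_postnet(mel)

mel = infer(np.zeros((1, 22), np.int64))
print(mel.shape)  # (1, 6, 80): the toy gate fired on the 6th step
```

The point is only the wiring: encoder once, `decoder_iter` in a loop gated by the stop flag, `postnet` once on the stacked frames.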
@GrzegorzKarchNV Thanks I got it working. Is there a reason it's split into 3 different models?
Can you show me how you got it to work? I'm trying

```python
encoder = onnxruntime.InferenceSession("./out/encoder.onnx")
texts = ["Hello World, good day."]
sequences, sequence_lengths = prepare_input_sequence(texts)
encoder_inputs = {encoder.get_inputs()[0].name: to_numpy(sequences),
                  encoder.get_inputs()[1].name: to_numpy(sequence_lengths)}
encoder_outs = encoder.run(None, encoder_inputs)
```
but I'm getting this error that I can't solve!

```
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running LSTM node. Name:'LSTM_28' Status Message: Invalid value/s in sequence_lens. All values must be > 0 and < seq_length. seq_length=22
```
I have the same error; did you manage to fix it?
I have the same error here. Any updates?
The best I managed was feeding it longer inputs until it worked. No idea why that helps. I'd still like to know an actual fix.
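The longer-input workaround is consistent with the error message itself: the LSTM node rejects any `sequence_lens` entry that is not strictly less than `seq_length`, and with a single-sentence batch the real length equals the padded width (22 = 22 above). A hedged sketch of padding the batch by one extra token so the strict check passes; `pad_for_onnx_lstm` and `pad_id` are my own names, not part of the export script.

```python
import numpy as np

def pad_for_onnx_lstm(sequences, sequence_lengths, pad_id=0):
    """Append one pad_id column whenever the longest real length equals the
    padded width, so every sequence_lens entry is strictly less than
    seq_length (the condition the LSTM node enforces)."""
    batch, seq_len = sequences.shape
    if int(sequence_lengths.max()) >= seq_len:
        pad = np.full((batch, 1), pad_id, dtype=sequences.dtype)
        sequences = np.concatenate([sequences, pad], axis=1)
    return sequences, sequence_lengths

# e.g. a batch of one 22-token sentence becomes 23 columns wide,
# while the reported lengths stay at 22
seqs, lens = pad_for_onnx_lstm(np.ones((1, 22), np.int64),
                               np.array([22], np.int64))
```

If this diagnosis is right, it also explains why batching several sentences of different lengths "fixes" it: only the longest sentence's length ever equals the padded width.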
I used the Tacotron2 -> ONNX export script: `PyTorch/SpeechSynthesis/Tacotron2/exports/export_tacotron2_onnx.py`

But it produced 3 separate files (encoder, decoder_iter, and postnet).

How do we actually use these models with ONNX Runtime?