jaywalnut310 / glow-tts

A Generative Flow for Text-to-Speech via Monotonic Alignment Search
MIT License

Transfer to tensorRT and using triton inference #39

Open andrey999333 opened 4 years ago

andrey999333 commented 4 years ago

Does anyone have experience converting these models to TensorRT and/or using the NVIDIA Triton Inference Server container to serve Glow-TTS models?

arijitx commented 4 years ago

Some time back I tried to convert the provided model to ONNX. At the time it didn't work out of the box because of the normalizing flows; they would need to be rewritten in an ONNX-compatible way. If you manage to convert to ONNX, there is a good chance you will also be able to convert that ONNX model to TensorRT.

andrey999333 commented 4 years ago

I managed to convert the model to TorchScript with trace and serve it with Triton, but I still struggle to convert it to ONNX.

ErenBalatkan commented 3 years ago

> I managed to convert the model to TorchScript with trace and serve it with Triton, but I still struggle to convert it to ONNX.

Greetings, I'm having trouble converting to TorchScript with trace. Do you remember what changes you had to make?

Specifically, I'm facing the following error:

    weight = weight.view(self.n_split, self.n_split, 1, 1)
RuntimeError: Cannot insert a Tensor that requires grad as a constant. Consider making it a parameter or input, or detaching the gradient
Tensor:
 0.2230 -0.1144  1.1334 -2.2441
-0.3550 -0.0271 -0.3431 -3.5898
 1.6052  1.2910 -0.2057 -0.1371
 1.3640 -0.7758 -0.0487  0.0655
[ torch.FloatTensor{4,4} ]
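For anyone hitting the same error: below is a toy module that sketches the same pattern as glow-tts's invertible 1x1 convolution (not the repo's actual code). The usual cause is a plain tensor attribute that still requires grad, which `torch.jit.trace` must bake in as a constant; the common workaround is to detach it (or trace under `torch.no_grad()`).

```python
import torch

class ToyInvConv(torch.nn.Module):
    """Toy stand-in for an invertible 1x1 conv; n_split mirrors the error above."""
    def __init__(self, n_split=4):
        super().__init__()
        self.n_split = n_split
        # orthogonal init, as is typical for invertible convs
        w_init = torch.linalg.qr(torch.randn(n_split, n_split))[0]
        self.weight = torch.nn.Parameter(w_init)
        self.weight_inv = None

    def store_inverse(self):
        # Without .detach(), self.weight_inv is a plain attribute that still
        # requires grad, and torch.jit.trace fails with "Cannot insert a
        # Tensor that requires grad as a constant" when it bakes it in.
        self.weight_inv = torch.inverse(self.weight).detach()

    def forward(self, x):
        weight = self.weight_inv.view(self.n_split, self.n_split, 1, 1)
        return torch.nn.functional.conv2d(x, weight)

m = ToyInvConv().eval()
m.store_inverse()
x = torch.randn(1, 4, 16, 16)
traced = torch.jit.trace(m, x)  # succeeds because weight_inv is detached
```

This is only a sketch of the failure mode; the exact attribute names in glow-tts differ.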
andrey999333 commented 3 years ago

I do not remember this error, but I did it quite a while ago, so I cannot be sure. To convert to TorchScript, I rewrote the whole model for prediction only, removing the training part. In glow models, prediction and training are very different, so you can simplify the model a lot if you do not need to support training.
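A rough sketch of that idea, using a hypothetical toy model (the real glow-tts signatures differ): wrap only the generation path in a small module and trace that under `torch.no_grad()`, so the training branch never enters the traced graph.

```python
import torch

class FullModel(torch.nn.Module):
    """Hypothetical model with distinct training and generation paths."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Conv1d(80, 80, kernel_size=1)

    def forward(self, x, gen=False):
        if gen:
            return self.net(x)           # generation path only
        return self.net(x), x.mean()     # training path also returns a loss term

class InferenceOnly(torch.nn.Module):
    """Wrapper exposing just the prediction path so tracing stays simple."""
    def __init__(self, model):
        super().__init__()
        self.model = model.eval()

    def forward(self, x):
        # trace records only the branch actually executed, i.e. generation
        return self.model(x, gen=True)

wrapped = InferenceOnly(FullModel())
x = torch.randn(1, 80, 50)
with torch.no_grad():
    traced = torch.jit.trace(wrapped, x)
```

The traced `InferenceOnly` module can then be saved with `traced.save(...)` and served from Triton's PyTorch backend.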

debasish-mihup commented 3 years ago

> I do not remember this error, but I did it quite a while ago, so I cannot be sure. To convert to TorchScript, I rewrote the whole model for prediction only, removing the training part. In glow models, prediction and training are very different, so you can simplify the model a lot if you do not need to support training.

Can you share the code you used to port the inference part to TorchScript? I am trying to run a model on the Android platform using the same approach.

neso613 commented 2 years ago

>     weight = weight.view(self.n_split, self.n_split, 1, 1)
> RuntimeError: Cannot insert a Tensor that requires grad as a constant. Consider making it a parameter or input, or detaching the gradient
> Tensor:
>  0.2230 -0.1144  1.1334 -2.2441
> -0.3550 -0.0271 -0.3431 -3.5898
>  1.6052  1.2910 -0.2057 -0.1371
>  1.3640 -0.7758 -0.0487  0.0655
> [ torch.FloatTensor{4,4} ]

@andrey999333 What are the latency and performance like? Can you please share your observations?

neso613 commented 2 years ago

> I do not remember this error, but I did it quite a while ago, so I cannot be sure. To convert to TorchScript, I rewrote the whole model for prediction only, removing the training part. In glow models, prediction and training are very different, so you can simplify the model a lot if you do not need to support training.

@andrey999333 @arijitx I am able to convert the model successfully, but I get an error during inference: https://github.com/rhasspy/glow-tts-train/issues/2

Your input is much appreciated. Thanks.