NVIDIA-AI-IOT / torch2trt

An easy to use PyTorch to TensorRT converter
MIT License

The output is different when loading the serialized model with the C++ API #59

Closed YirongMao closed 4 years ago

YirongMao commented 4 years ago

Hi, I serialized the model, and then loaded the model in C++ API following https://github.com/NVIDIA-AI-IOT/torch2trt/issues/16

But when I sent a tensor of all ones into both models (the PyTorch model and the serialized model loaded in C++), their outputs differed. When I sent a tensor of all zeros, the outputs were the same.

My TensorRT version is 5.0.2.6 and my PyTorch version is 1.1.0.

YirongMao commented 4 years ago

So sorry, the outputs are actually the same. There was a problem in my comparison code.