NVIDIA-AI-IOT / torch2trt

An easy to use PyTorch to TensorRT converter
MIT License

Cannot install with plugins #41

Closed: smell-fishy closed this issue 5 years ago

smell-fishy commented 5 years ago

Hello, I like this awesome repository and it will be very helpful, but I have some trouble installing it. When I tried to install with python setup.py install --plugins, it gave an error (see attached screenshot), and afterwards I could not convert a model that uses torch.nn.functional.interpolate.

jaybdub commented 5 years ago

Hi smell-fishy,

Thanks for reaching out!

Could you try adding -std=c++11 to this line

https://github.com/NVIDIA-AI-IOT/torch2trt/blob/b9a2c3f6f2be432f3caee07cca4a065d60328b72/build.py#L21

So it would read

  command = g++ -c -fPIC $$in -I$cuda_dir/include -I$torch_dir/include -I$torch_dir/include/torch/csrc/api/include -I. -std=c++11

Please let me know if this works for you.

Best, John
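For readers hitting the same build error, the edit John suggests can be sketched as a small Python patch. This is a minimal sketch: patch_cxx_std is a hypothetical helper, and the sample string stands in for the real contents of build.py.

```python
# Sketch: append -std=c++11 to the g++ command line in torch2trt's build.py.
# patch_cxx_std is a hypothetical helper; back up build.py before editing.
def patch_cxx_std(text: str) -> str:
    """Add -std=c++11 to any g++ command line that does not already have it."""
    out = []
    for line in text.splitlines():
        if line.lstrip().startswith("command = g++") and "-std=c++11" not in line:
            line += " -std=c++11"
        out.append(line)
    return "\n".join(out)

# Sample line standing in for the real build.py contents.
sample = "command = g++ -c -fPIC $$in -I."
print(patch_cxx_std(sample))
# -> command = g++ -c -fPIC $$in -I. -std=c++11
```

The check for an existing -std=c++11 makes the patch safe to run more than once.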

smell-fishy commented 5 years ago


Thank you! I added the flag, but now it fails with a different error (see attached screenshot).

smell-fishy commented 5 years ago

Hi, I solved that problem. It was a protobuf version issue: I upgraded protobuf from 2.6.1 to 3.9.1 and the install succeeded. But when I tried to convert my model, it gave another error (see attached screenshot).
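A simple guard against this class of failure is to check the protobuf version before building. This is a minimal sketch: min_protobuf_ok is a hypothetical helper, and the 3.0.0 floor reflects the 2.6.1 to 3.9.1 upgrade that fixed the build in this thread.

```python
# Sketch: verify the installed protobuf is new enough before building plugins.
# min_protobuf_ok is a hypothetical helper, not part of torch2trt.
def min_protobuf_ok(version: str, minimum: str = "3.0.0") -> bool:
    """Return True if a dotted version string is at least `minimum`."""
    def parts(v: str) -> tuple:
        return tuple(int(p) for p in v.split("."))
    return parts(version) >= parts(minimum)

print(min_protobuf_ok("2.6.1"))  # the failing setup: False
print(min_protobuf_ok("3.9.1"))  # after the upgrade: True
```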

smell-fishy commented 5 years ago

Thanks! This issue is the same as #20

jaybdub commented 5 years ago

Good to know! Thanks for sharing.

Best, John