baidu-research / DeepBench

Benchmarking Deep Learning operations on different hardware
Apache License 2.0
1.07k stars 239 forks

TensorRT support for optimized inference results on Nvidia? #93

Closed oscarbg closed 6 years ago

oscarbg commented 6 years ago

Hi, I have been testing inference results on my GTX 970 card, and from compiling the NVIDIA benchmarks it seems the inference benchmarks also use cuDNN. Wouldn't the results be better if TensorRT were used? Isn't TensorRT meant for optimized inference performance compared to cuDNN? Thanks.

sharannarang commented 6 years ago

Thanks for the suggestion. We aren't planning to support TensorRT at this time, but we'd be happy to accept contributions from the community for this feature.

oscarbg commented 6 years ago

Thanks for the info.

psyhtest commented 6 years ago

@oscarbg If you are interested, Collective Knowledge supports TensorRT benchmarking (with some older TX1 results publicly available).