dotnet / machinelearning

ML.NET is an open source and cross-platform machine learning framework for .NET.
https://dot.net/ml
MIT License

Add TensorRT support #6212

Open · turowicz opened this issue 2 years ago

turowicz commented 2 years ago

**Is your feature request related to a problem? Please describe.**
Currently the fastest way of running Computer Vision inference is with a TensorRT-optimised model. TensorRT is widely available in C/C++, but you cannot really use it from C#.

**Describe the solution you'd like**
I would like to be able to load a TensorRT engine into C# memory and call it from there using OpenCvSharp's Mat structures.

**Describe alternatives you've considered**
We are currently using Triton Inference Server, but it adds overhead for data serialisation and transmission.

**Additional context**
There are certain scenarios, such as quality control, that would benefit greatly from calling a TensorRT model in-process.
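
To make the data path concrete, here is a minimal sketch of the Mat-to-tensor marshaling any in-process call would involve, assuming OpenCvSharp; the 224x224 size, [0, 1] scaling, and CHW layout are illustrative assumptions, not part of the request:

```csharp
using System.Runtime.InteropServices;
using OpenCvSharp;

// Convert an OpenCvSharp Mat (interleaved HWC, BGR) into a planar CHW
// float buffer, the layout most vision models expect.
using var image = Cv2.ImRead("frame.jpg");
Cv2.Resize(image, image, new Size(224, 224));

// Scale pixels to [0, 1] as 32-bit floats.
using var floats = new Mat();
image.ConvertTo(floats, MatType.CV_32FC3, 1.0 / 255.0);

// Copy the interleaved HWC data out of native memory...
int h = floats.Rows, w = floats.Cols, c = floats.Channels();
var hwc = new float[h * w * c];
Marshal.Copy(floats.Data, hwc, 0, hwc.Length);

// ...and transpose HWC -> CHW for the model input.
var chw = new float[c * h * w];
for (int y = 0; y < h; y++)
    for (int x = 0; x < w; x++)
        for (int ch = 0; ch < c; ch++)
            chw[ch * h * w + y * w + x] = hwc[(y * w + x) * c + ch];
```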

luisquintanilla commented 2 years ago

Hi @turowicz

ML.NET offers the ability to export models to ONNX, which, from my understanding, is one of the formats TensorRT supports.

To export an ML.NET model to ONNX, you use the ConvertToOnnx method.

Here's additional documentation on how to do it:

https://docs.microsoft.com/dotnet/machine-learning/how-to-guides/save-load-machine-learning-models-ml-net#save-an-onnx-model-locally

Does that satisfy your requirements?
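
For reference, a minimal sketch of that export path; the trivial normalizer pipeline and the ModelInput type are placeholders standing in for a real trained model:

```csharp
using System.IO;
using Microsoft.ML;
using Microsoft.ML.Data;

// Requires the Microsoft.ML and Microsoft.ML.OnnxConverter NuGet packages.
var mlContext = new MLContext();

IDataView data = mlContext.Data.LoadFromEnumerable(new[]
{
    new ModelInput { Features = new[] { 1f, 2f, 3f, 4f } }
});

// Any fitted ITransformer works here; a min-max normalizer keeps the
// example small while still producing an exportable graph.
ITransformer model = mlContext.Transforms.NormalizeMinMax("Features").Fit(data);

// ConvertToOnnx writes the fitted pipeline out as an ONNX model.
using var stream = File.Create("model.onnx");
mlContext.Model.ConvertToOnnx(model, data, stream);

// Placeholder input schema for this sketch.
public class ModelInput
{
    [VectorType(4)]
    public float[] Features { get; set; }
}
```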

turowicz commented 2 years ago

I meant the other way round. Load up a TRT model in ML.NET and infer on data.
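
For context, the closest in-process route today is probably ONNX Runtime's TensorRT execution provider rather than ML.NET itself: it consumes an ONNX graph (not a prebuilt .engine file) and hands execution to TensorRT. A sketch, assuming the Microsoft.ML.OnnxRuntime.Gpu package with TensorRT support plus local CUDA/TensorRT libraries; the input name "images" and the 1x3x224x224 shape are assumptions about the model:

```csharp
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Route execution through TensorRT on GPU 0; operators TensorRT cannot
// handle fall back to the other registered providers.
using var options = new SessionOptions();
options.AppendExecutionProvider_Tensorrt(0);
using var session = new InferenceSession("model.onnx", options);

// Input name and 1x3x224x224 shape are assumptions about the model.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new[] { NamedOnnxValue.CreateFromTensor("images", input) };

using var results = session.Run(inputs);
float[] output = results.First().AsEnumerable<float>().ToArray();
```

This does not load a serialized TensorRT engine directly, but the provider can cache the engines it builds, so the rebuild cost is paid once rather than on every run.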

michaelgsharp commented 2 years ago

What model format are you thinking of? Still ONNX? It also sounds like a large part of the ask is having a way to avoid copying the data, is that correct?
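
On the copy question: ONNX Runtime's C# API can already bind caller-owned buffers so that Run reads and writes them in place. A sketch continuing the session from the previous example; the tensor names and shapes are assumptions:

```csharp
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Reuse caller-owned buffers across Run calls so ONNX Runtime reads
// inputs and writes outputs in place instead of allocating copies.
var inputTensor = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var outputTensor = new DenseTensor<float>(new[] { 1, 1000 });

using var inputValue = FixedBufferOnnxValue.CreateFromTensor(inputTensor);
using var outputValue = FixedBufferOnnxValue.CreateFromTensor(outputTensor);

// session is an InferenceSession as in the earlier sketch; the
// "images"/"logits" names are assumptions about the model.
session.Run(
    new[] { "images" }, new[] { inputValue },
    new[] { "logits" }, new[] { outputValue });
// outputTensor now holds the results with no extra managed copies.
```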

turowicz commented 2 years ago

Model format: TensorRT by NVIDIA

turowicz commented 2 years ago

Load it in C#, run inference. This requires C# externs for the TensorRT C runtime.
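
To sketch what that would entail: TensorRT itself ships a C++ API, so the externs would realistically target a small hand-written C shim. Everything below, including the trt_shim library name and every entry point, is hypothetical:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical sketch only: these entry points do not exist in TensorRT.
// They assume a hand-written C shim (here called "trt_shim") that wraps
// engine deserialization and execution behind a C ABI.
internal static class TrtShim
{
    [DllImport("trt_shim")]
    internal static extern IntPtr trt_load_engine(string path);

    [DllImport("trt_shim")]
    internal static extern int trt_infer(
        IntPtr engine, float[] input, int inputLen, float[] output, int outputLen);

    [DllImport("trt_shim")]
    internal static extern void trt_free_engine(IntPtr engine);
}
```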

michaelgsharp commented 2 years ago

@luisquintanilla I'll mark this as future for now, but we need to figure out whether this aligns with our goals and, if so, when we would be able to take a look at it.