Open ckmstydy opened 1 month ago
I haven't used this code for a while, so my answer may be incomplete. First, clone the dme-compunet/YoloV8 code into your project, then in the NuGet package manager uninstall Microsoft.ML.OnnxRuntime and install Microsoft.ML.OnnxRuntime.Gpu.
Second, add the zlibwapi.dll file to your project and set it to 'Copy to Output Directory'.
Third, when you call YoloV8Builder().WithSessionOptions, create a SessionOptions instance (SessionOptions options = new SessionOptions();) and call the appropriate options.AppendExecutionProvider_xxxxxxx() on it. Below is an example that uses the DirectML (DirectX 12) execution provider, not the CUDA GPU one.
SessionOptions options = new SessionOptions();
options.AppendExecutionProvider_DML();
var predictor2 = new YoloV8Builder().WithSessionOptions(options).UseOnnxModel(new BinarySelector("onnx/yolov8n-seg.onnx")).Build();
return predictor2;
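For the CUDA path the only change should be the provider that gets appended. Here is a minimal sketch, not something I have retested: the using directives are my guess at the library's namespaces, it assumes Microsoft.ML.OnnxRuntime.Gpu is installed, and it reuses the same yolov8n-seg.onnx path as above.

```csharp
using Compunet.YoloV8;               // assumed namespace of dme-compunet/YoloV8
using Microsoft.ML.OnnxRuntime;

// Same builder call as above, but appending the CUDA execution provider.
// Assumes a CUDA-capable GPU with CUDA/cuDNN versions matching your
// ONNX Runtime build; device 0 is used here.
SessionOptions cudaOptions = new SessionOptions();
cudaOptions.AppendExecutionProvider_CUDA(0);

var predictor = new YoloV8Builder()
    .WithSessionOptions(cudaOptions)
    .UseOnnxModel(new BinarySelector("onnx/yolov8n-seg.onnx"))
    .Build();
```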
@ckmstydy Try adding the CUDA bin folder to the Path environment variable.
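If you can't (or don't want to) change the system Path, a process-local workaround is to prepend the CUDA bin directory to PATH before the first ONNX Runtime call. A minimal sketch; the CUDA install path below (v12.2) is an assumption and must match your machine:

```csharp
using System;
using System.IO;

// Prepend the CUDA bin directory to this process's PATH so that
// onnxruntime_providers_cuda.dll can find the CUDA/cuDNN DLLs it depends on.
// The path below is an assumption; adjust it to your actual CUDA install.
const string cudaBin = @"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.2\bin";

if (Directory.Exists(cudaBin))
{
    string current = Environment.GetEnvironmentVariable("PATH") ?? string.Empty;
    Environment.SetEnvironmentVariable("PATH", cudaBin + Path.PathSeparator + current);
}
```

This has to run before the first predictor or InferenceSession is created; once the native library has been resolved, changing PATH no longer helps.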
Thanks, but unfortunately it didn't work for me.
Have already added it, and "nvcc -V" works correctly.
I have installed CUDA 12.2 and cuDNN 9.3, but I still get an error when loading the model with YoloPredictor (my ONNX Runtime version is 1.19.1):
[ErrorCode: RuntimeException] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1637 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "D:\OrderTaking\PupilDet-onnx-gpu\PupilDet\bin\Debug\net8.0-windows\runtimes\win-x64\native\onnxruntime_providers_cuda.dll"]
var predictor = new YoloPredictor("/model/best.onnx");
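Error 126 usually means that a native dependency of onnxruntime_providers_cuda.dll could not be found (CUDA/cuDNN DLLs not on PATH, or versions that don't match what your ONNX Runtime build expects). A quick way to see what ONNX Runtime reports before building the predictor, as a diagnostic sketch of my own rather than part of the YoloV8 API:

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

// List the execution providers compiled into this ONNX Runtime build.
// With the Gpu package installed, "CUDAExecutionProvider" should appear here.
string[] providers = OrtEnv.Instance().GetAvailableProviders();
Console.WriteLine(string.Join(", ", providers));

// Appending CUDA explicitly triggers loading onnxruntime_providers_cuda.dll,
// so a missing CUDA/cuDNN dependency should surface here, before any model
// is loaded, with the same error 126 you are seeing.
using var options = new SessionOptions();
options.AppendExecutionProvider_CUDA(0);
```

If the explicit AppendExecutionProvider_CUDA call fails the same way, the problem is on the CUDA/cuDNN side (installation or PATH) rather than in the YoloV8 wrapper or the model file.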