microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

LNK2001:unresolved external symbol OrtSessionOptionsAppendExecutionProvider_Tensorrt #8681

Open · chwjshlwel opened this issue 3 years ago

chwjshlwel commented 3 years ago

Describe the bug
I run the following ONNX Runtime C++ code:

#include "stdafx.h"
#include <windows.h>
#include <windowsx.h>
#include <onnxruntime_cxx_api.h>
#include <cuda_provider_factory.h>
#include <onnxruntime_c_api.h>
#include <tensorrt_provider_factory.h>

int main() {
  Ort::Env env{ ORT_LOGGING_LEVEL_WARNING, "test" };
  Ort::SessionOptions sf;
  int device_id = 0;
  Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_Tensorrt(sf, device_id));
  Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_CUDA(sf, device_id));
  Ort::Session session(env, L"mobilenet.onnx", sf);
  return 0;
}


To Reproduce
When I build the solution, I get the error LNK2001: unresolved external symbol OrtSessionOptionsAppendExecutionProvider_Tensorrt.

I linked onnxruntime.lib, onnxruntime_providers_cuda.lib, onnxruntime_providers_tensorrt.lib, and nvinfer.lib, and I also put all the DLLs in the same directory as the exe. Please help me solve this problem. Thanks!

yuslepukhin commented 3 years ago

How did you create the solution? This may be helpful.

Does onnxruntime.dll export the symbol in question? For example, the CPU-only build exports OrtSessionOptionsAppendExecutionProvider_CPU:

d:\dev\ort_trans>dumpbin /EXPORTS d:\dev\ort_trans\cpu_build\Debug\Debug\onnxruntime.dll
Microsoft (R) COFF/PE Dumper Version 14.29.30040.0
Copyright (C) Microsoft Corporation. All rights reserved.

Dump of file d:\dev\ort_trans\cpu_build\Debug\Debug\onnxruntime.dll

File Type: DLL

Section contains the following exports for onnxruntime.dll [...]

ordinal hint RVA      name

      1    0 000441A7 OrtGetApiBase = @ILT+274850(OrtGetApiBase)
      2    1 0005B7D0 OrtSessionOptionsAppendExecutionProvider_CPU = @ILT+370635(OrtSessionOptionsAppendExecutionProvider_CPU)
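
A minimal runtime check, not taken from this thread (the DLL name and placement are assumptions): load the DLL yourself and probe for the TensorRT symbol with GetProcAddress; a null result means that this onnxruntime.dll was not built with the TensorRT execution provider.

#include <windows.h>
#include <iostream>

int main() {
  // Assumes onnxruntime.dll sits next to the exe (or is on the DLL search path).
  HMODULE ort = LoadLibraryW(L"onnxruntime.dll");
  if (!ort) {
    std::cerr << "could not load onnxruntime.dll\n";
    return 1;
  }
  // Exported only if the DLL was built with the TensorRT execution provider enabled.
  FARPROC sym = GetProcAddress(ort, "OrtSessionOptionsAppendExecutionProvider_Tensorrt");
  std::cout << "OrtSessionOptionsAppendExecutionProvider_Tensorrt "
            << (sym ? "is exported\n" : "is NOT exported\n");
  FreeLibrary(ort);
  return 0;
}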
chwjshlwel commented 3 years ago

There is no problem with OrtSessionOptionsAppendExecutionProvider_CPU and OrtSessionOptionsAppendExecutionProvider_CUDA; the problem is only with TensorRT. I checked the versions and it may be a version mismatch, so I will try onnxruntime 1.8.2, CUDA 11.1, cuDNN 8.0.5, and TensorRT 7.2.2.3 and see whether the problem remains.

There is also an error when I use the CUDA provider: with the onnxruntime_providers_cuda.dll that I built myself it fails, while the onnxruntime_providers_cuda.dll downloaded from https://github.com/microsoft/onnxruntime/releases/download/v1.8.1/onnxruntime-win-gpu-x64-1.8.1.zip works fine, so I think my DLLs are the problem. The build shows success, but the tests fail; since I built with --skip_tests, it still reports a successful build. I attached my onnxruntime_providers_cuda.dll and onnxruntime_providers_tensorrt.dll: [Debug.zip](https://github.com/microsoft/onnxruntime/files/6972168/Debug.zip)

chwjshlwel commented 3 years ago

Is there any link where I can download onnxruntime_providers_tensorrt.dll? I think my build failed even though it reports success.

yuslepukhin commented 3 years ago

> Is there any link where I can download onnxruntime_providers_tensorrt.dll? I think my build failed even though it reports success.

This DLL is build-dependent; it must be built together with the other DLLs. So whatever you build locally must all come from the same build tree.

chwjshlwel commented 3 years ago

Thanks. I uploaded onnxruntime_providers_tensorrt.lib and onnxruntime_providers_tensorrt.dll. Please help me check whether these files are right: Debug (2).zip

yuslepukhin commented 3 years ago

> Thanks. I uploaded onnxruntime_providers_tensorrt.lib and onnxruntime_providers_tensorrt.dll. Please help me check whether these files are right: Debug (2).zip

I cannot tell whether these DLLs are correct for your build; they are build-dependent. Have you followed the instructions to run the build?

RyanUnderhill commented 3 years ago

Just to reaffirm, you need to follow these steps to have TensorRT available:

https://onnxruntime.ai/docs/how-to/build/eps.html#tensorrt
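
For reference, a build command along the lines that page describes (a sketch only; the paths are placeholders and the CUDA/cuDNN/TensorRT versions must match the support matrix for your ONNX Runtime release) is what produces onnxruntime.dll together with onnxruntime_providers_tensorrt.dll and onnxruntime_providers_shared.dll:

.\build.bat --config Release --build_shared_lib --parallel ^
  --use_cuda --cuda_home <path-to-CUDA> --cudnn_home <path-to-cuDNN> ^
  --use_tensorrt --tensorrt_home <path-to-TensorRT>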

The only library you need to link against in your code is onnxruntime.lib (or you can access onnxruntime.dll through GetProcAddress); onnxruntime.dll will dynamically load the other provider libraries itself.
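
To illustrate that setup, here is a minimal sketch (not code from this issue; it assumes ONNX Runtime 1.7 or later built with the TensorRT EP, with the provider DLLs placed next to onnxruntime.dll): only onnxruntime.lib is linked, and the TensorRT provider is appended through the versioned OrtApi rather than the legacy free function.

#include <onnxruntime_cxx_api.h>  // link against onnxruntime.lib only

int main() {
  Ort::Env env{ ORT_LOGGING_LEVEL_WARNING, "test" };
  Ort::SessionOptions sf;

  // TensorRT provider options: zero-initialize, then set the fields you care about.
  OrtTensorRTProviderOptions trt_options{};
  trt_options.device_id = 0;
  trt_options.trt_max_partition_iterations = 1000;
  trt_options.trt_min_subgraph_size = 1;
  trt_options.trt_max_workspace_size = 1ULL << 30;  // 1 GiB
  Ort::ThrowOnError(
      Ort::GetApi().SessionOptionsAppendExecutionProvider_TensorRT(sf, &trt_options));

  // CUDA and CPU can be appended the same way as fallbacks for nodes TensorRT does not take.

  // onnxruntime.dll loads the provider DLLs here, at session creation time.
  Ort::Session session(env, L"mobilenet.onnx", sf);  // model name taken from the bug report
  return 0;
}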

stale[bot] commented 2 years ago

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.