Closed zeng-hello-world closed 4 years ago
After loading the `.so` built from this TensorRT sample, `PLUGINS_LOADED: True` is printed.
```python
from .torch2trt import *
from .converters import *
import tensorrt as trt


def load_plugins():
    import os
    import ctypes
    # ctypes.CDLL(os.path.join(os.path.dirname(__file__), 'libtorch2trt.so'))
    ctypes.CDLL("TensorRT-6.0.1.5/samples/python/uff_custom_plugin/build/libclipplugin.so")
    print("---------------- after -----------------")
    registry = trt.get_plugin_registry()
    # test
    for plugin in registry.plugin_creator_list:
        print("BB: ", plugin.name, plugin.plugin_namespace)

    registry = trt.get_plugin_registry()
    torch2trt_creators = [c for c in registry.plugin_creator_list if c.plugin_namespace == 'torch2trt']
    for c in torch2trt_creators:
        registry.register_creator(c, 'torch2trt')


try:
    load_plugins()
    PLUGINS_LOADED = True
except OSError:
    PLUGINS_LOADED = False

print("PLUGINS_LOADED: ", PLUGINS_LOADED)
```
I get this:
```
---------------- after -----------------
BB: RnRes2Br1Br2c_TRT
BB: CgPersistentLSTMPlugin_TRT
BB: RnRes2Br2bBr2c_TRT
BB: SingleStepLSTMPlugin
BB: FancyActivation
BB: ResizeNearest
BB: Split
BB: InstanceNormalization
BB: CustomClipPlugin
PLUGINS_LOADED: True

Process finished with exit code 0
```
So the loading issue may be caused by the build environment.
I can observe the same behavior. When building on a desktop computer, the plugins do not load. We tried building on a Jetson AGX yesterday, and that worked without problems. I also saw that in `build.py` the lib and include paths are set to the ARM version of Linux; on a non-ARM computer these have to be changed to `/usr/include/x86_64-linux-gnu` and `/usr/lib/x86_64-linux-gnu`. But still it doesn't change anything. Still can't load the plugin :/
Me too. It works well on my AGX Xavier, but when I try to load it on an x86_64 computer it throws link errors like:

```
undefined symbol: _ZN3c1014DeviceTypeNameB5cxx11ENS_10DeviceTypeEb
```
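That mangled name is a `c10::DeviceTypeName` symbol from PyTorch's c10 library, so the plugin was built against libtorch but the dependency isn't resolved at load time. A minimal sketch of one common workaround, preloading the dependency with `RTLD_GLOBAL` so its symbols become visible to libraries loaded afterwards. This is an assumption about the cause, not a confirmed fix, and the dependency names and plugin path are placeholders:

```python
import ctypes
import ctypes.util


def load_with_deps(plugin_path, dep_names=("c10", "torch")):
    """Load plugin_path, first exporting the symbols of likely dependencies.

    Each dependency found on the system is opened with RTLD_GLOBAL, which
    makes its symbols available process-wide so the dynamic linker can
    resolve the plugin's undefined references (e.g. the c10 symbols).
    The dep_names here are guesses, not documented requirements.
    """
    for name in dep_names:
        dep = ctypes.util.find_library(name)
        if dep is not None:
            ctypes.CDLL(dep, mode=ctypes.RTLD_GLOBAL)
    return ctypes.CDLL(plugin_path)
```

Doing `import torch` before loading the plugin may have a similar effect, since importing torch pulls its shared libraries into the process first.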
Where do you get this error? I don't get any compilation errors; it just fails to load the plugin. I also tried putting the plugin in the TRT sample folder where this clipping plugin is built, but the same: it compiles, but it can't be loaded.
After installing, modify the `__init__.py` in the interpolate folder of the Python install site like this, changing

```python
try:
    load_plugins()
    PLUGINS_LOADED = True
except OSError:
    PLUGINS_LOADED = False
```

to

```python
try:
    load_plugins()
    PLUGINS_LOADED = True
except OSError as e:
    print(e)
    PLUGINS_LOADED = False
```
> After installing, modify the `__init__.py` in the interpolate folder like this: change `except OSError:` to `except OSError as e: print(e)`
I can't find this line in interpolate/__init__.py
Sorry, here: `(tensorrt) nvidia@Dell:~/anaconda3/envs/tensorrt/lib/python3.6/site-packages/torch2trt$ vim __init__.py`

Modify:

```python
try:
    load_plugins()
    PLUGINS_LOADED = True
except OSError as e:
    print(e)
    PLUGINS_LOADED = False
```

@nan0755
> After installing, modify the `__init__.py` in the interpolate folder like this: change `except OSError:` to `except OSError as e: print(e)`
I got this:

```
libtorch2trt.so: undefined symbol: _ZNK6google8protobuf7Message11GetTypeNameB5cxx11Ev
```

which seems like a protobuf issue, looks like this: https://github.com/onnx/onnx/issues/745
That's another linking error, about protobuf...
Was the `libtorch2trt.so: undefined symbol: _ZNK6google8protobuf7Message11GetTypeNameB5cxx11Ev` error solved? I'm getting the same issue.
> After installing, modify the `__init__.py` in the interpolate folder like this: change `except OSError:` to `except OSError as e: print(e)`

Excuse me, did you fix it?
I have built torch2trt using the command `python setup.py install --plugin`, and I noticed that `libtorch2trt.so` has been generated. Then I added some debug code in `torch2trt/__init__.py` as above. When I run `test.py`, I got this: all the plugins printed before the `---------------- after -----------------` marker are registered by TensorRT by default, but I don't know why the "after" output does not come out after loading `libtorch2trt.so`. And `PLUGINS_LOADED: False` means the load attempt failed.
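A small sanity check after `python setup.py install --plugin` can distinguish "the `.so` was never built" from "the `.so` exists but fails to link", instead of only seeing `PLUGINS_LOADED: False`. A sketch; the path is a hypothetical example, not the library's documented location:

```python
import ctypes
import os


def check_plugin(so_path):
    """Report whether the plugin file exists and whether it can be dlopen'ed."""
    if not os.path.exists(so_path):
        return "missing: " + so_path
    try:
        ctypes.CDLL(so_path)
        return "loaded: " + so_path
    except OSError as e:
        # Surface the dynamic linker's diagnostic (e.g. "undefined symbol: ...")
        # instead of a bare PLUGINS_LOADED = False.
        return "load failed: " + str(e)


print(check_plugin("libtorch2trt.so"))  # path is an assumption
```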