NVIDIA-AI-IOT / ros2_trt_pose

ROS 2 package for "trt_pose": real-time human pose estimation on NVIDIA Jetson Platform
MIT License

Getting Error cuDNN_STATUS_NOT_INITIALIZED #8

Open jptalledo10 opened 3 years ago

jptalledo10 commented 3 years ago

Hi

I'm trying to run this ROS 2 package on a desktop PC running Ubuntu 18.04 and ROS 2 Foxy, but I'm getting the following error:

```
$ ros2 run ros2_trt_pose pose-estimation --ros-args -p base_dir:='/home/jvilela/pose'
[INFO] [1615314429.365317977] [trt_pose]: Loading model weights
```

```
Traceback (most recent call last):
  File "/home/jvilela/ros2_ws/install/ros2_trt_pose/lib/ros2_trt_pose/pose-estimation", line 33, in <module>
    sys.exit(load_entry_point('ros2-trt-pose==0.0.0', 'console_scripts', 'pose-estimation')())
  File "/home/jvilela/ros2_ws/install/ros2_trt_pose/lib/python3.6/site-packages/ros2_trt_pose/live_demo.py", line 25, in main
    trt_pose.start()
  File "/home/jvilela/ros2_ws/install/ros2_trt_pose/lib/python3.6/site-packages/ros2_trt_pose/helper.py", line 81, in start
    model_weights=self.model_weights)
  File "/home/jvilela/ros2_ws/install/ros2_trt_pose/lib/python3.6/site-packages/ros2_trt_pose/utils.py", line 79, in load_model
    model_trt = torch2trt.torch2trt(model, [data], fp16_mode=True, max_workspace_size=1<<25)
  File "/usr/local/lib/python3.6/dist-packages/torch2trt-0.2.0-py3.6-linux-x86_64.egg/torch2trt/torch2trt.py", line 517, in torch2trt
    outputs = module(*inputs)
  File "/home/jvilela/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/jvilela/.local/lib/python3.6/site-packages/torch/nn/modules/container.py", line 119, in forward
    input = module(input)
  File "/home/jvilela/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/trt_pose-0.0.1-py3.6-linux-x86_64.egg/trt_pose/models/resnet.py", line 14, in forward
    x = self.resnet.conv1(x)
  File "/home/jvilela/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/jvilela/.local/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 399, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "/home/jvilela/.local/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 396, in _conv_forward
    self.padding, self.dilation, self.groups)
RuntimeError: cuDNN error: CUDNN_STATUS_NOT_INITIALIZED
```

I have CUDA 11.1 installed, and cuDNN is installed as well.

Any insight into what could be the problem?
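One way to narrow this down is to force cuDNN to initialize outside of ROS, since the traceback fails on the very first convolution. A minimal diagnostic sketch (the `cudnn_status` helper is just an illustrative name, not part of ros2_trt_pose):

```python
import importlib

def cudnn_status() -> str:
    """Report how far the PyTorch -> CUDA -> cuDNN stack gets before failing."""
    try:
        torch = importlib.import_module("torch")
    except ImportError:
        return "no-torch"
    if not torch.cuda.is_available():
        return "no-gpu"
    try:
        # A tiny convolution forces cuDNN to initialize, reproducing the
        # exact failure point from the traceback above.
        conv = torch.nn.Conv2d(3, 8, kernel_size=3).cuda()
        conv(torch.randn(1, 3, 8, 8, device="cuda"))
        return "ok"
    except RuntimeError as err:
        return f"cudnn-error: {err}"

print(cudnn_status())
```

If this already prints a `CUDNN_STATUS_NOT_INITIALIZED` error, the problem is in the PyTorch/CUDA/cuDNN install rather than in this package.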

ak-nv commented 3 years ago

This is currently supported for Eloquent. Also, which version of TensorRT are you running? You can find out with `dpkg -l | grep tensorrt`.
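If it helps, the versions embedded in the package names from that command can be pulled out programmatically. A small sketch (the `tensorrt_versions` helper is hypothetical; it only parses `-trtX.Y.Z.W-` style names like the ones below, not every TensorRT package):

```python
import re

def tensorrt_versions(dpkg_output: str) -> list:
    """Return TensorRT version numbers embedded in package names, e.g. '7.2.3.4'."""
    return re.findall(r"trt(\d+(?:\.\d+){2,})", dpkg_output)

# Example with a repo package name of the kind dpkg lists:
sample = "ii nv-tensorrt-repo-ubuntu1804-cuda10.2-trt7.2.3.4-ga-20210226 1-1 amd64"
print(tensorrt_versions(sample))  # ['7.2.3.4']
```

Feed it the output of `dpkg -l | grep tensorrt` to see every TensorRT build registered on the system at a glance.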

jptalledo10 commented 3 years ago

@ak-nv

I have the following on my system:

```
ii  nv-tensorrt-repo-ubuntu1804-cuda10.2-trt5.1.4.2-ga-20190506  1-1  amd64  nv-tensorrt repository configuration files
ii  nv-tensorrt-repo-ubuntu1804-cuda10.2-trt7.2.3.4-ga-20210226  1-1  amd64  nv-tensorrt reposito
```

I will try Eloquent and see what I get.

ak-nv commented 3 years ago

I have tested it with 7.1.3.0-1+cuda10.2 on NVIDIA Jetson.