marcoslucianops / DeepStream-Yolo

NVIDIA DeepStream SDK 7.0 / 6.4 / 6.3 / 6.2 / 6.1.1 / 6.1 / 6.0.1 / 6.0 / 5.1 implementation for YOLO models
MIT License

Unable to Compile Deepstream-yolo on Jetson with Deepstream 6.1 container #230

Closed Vorapolbig closed 2 years ago

Vorapolbig commented 2 years ago

Hi,

similarly to #104

I tried to compile DeepStream-Yolo on the "nvcr.io/nvidia/deepstream-l4t:6.1-samples" base image and it gave this error.

make: Entering directory '/home/DeepStream-Yolo/nvdsinfer_custom_impl_Yolo'
g++ -c -o utils.o -Wall -std=c++11 -shared -fPIC -Wno-error=deprecated-declarations -I/opt/nvidia/deepstream/deepstream/sources/includes -I/usr/local/cuda-11.4/include utils.cpp
In file included from utils.cpp:26:
utils.h:36:10: fatal error: NvInfer.h: No such file or directory
   36 | #include "NvInfer.h"
      |          ^~~~~~~~~~~
compilation terminated.
make: *** [Makefile:70: utils.o] Error 1
make: Leaving directory '/home/DeepStream-Yolo/nvdsinfer_custom_impl_Yolo'
The command '/bin/sh -c git clone https://github.com/marcoslucianops/DeepStream-Yolo && cd DeepStream-Yolo && CUDA_VER=11.4 make -C nvdsinfer_custom_impl_Yolo' returned a non-zero code: 2

marcoslucianops commented 2 years ago

Use the devel docker image.

Vorapolbig commented 2 years ago

Devel is not available for Jetson; only base, iot, and samples are.

marcoslucianops commented 2 years ago

Sorry, use the iot docker image.

Vorapolbig commented 2 years ago

Hi, thank you for the prompt response.

I tried all of the containers: base, samples, iot, and triton (https://catalog.ngc.nvidia.com/orgs/nvidia/containers/deepstream-l4t). All of them gave the same error.

make: Entering directory '/home/DeepStream-Yolo/nvdsinfer_custom_impl_Yolo'
g++ -c -o utils.o -Wall -std=c++11 -shared -fPIC -Wno-error=deprecated-declarations -I/opt/nvidia/deepstream/deepstream/sources/includes -I/usr/local/cuda-11.4/include utils.cpp
In file included from utils.cpp:26:
utils.h:36:10: fatal error: NvInfer.h: No such file or directory
   36 | #include "NvInfer.h"
      |          ^~~
compilation terminated.

It seems to be a similar problem to https://github.com/wang-xinyu/tensorrtx/issues/272. Do you have any suggestion on which path I should change?

marcoslucianops commented 2 years ago

I found the problem: with DS 6.1, the container image does not include certain header files that are only present on the host machine once the compute libraries are installed from JetPack.

You need to use the nvcr.io/nvidia/deepstream-l4t:6.1-samples docker image and mount the missing headers from the host. Reference: Building DeepStream reference application source inside L4T triton container
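For illustration, mounting the headers could look like the sketch below. The host path assumes a JetPack 5.x layout where the TensorRT headers (NvInfer.h etc.) live under /usr/include/aarch64-linux-gnu; check where NvInfer.h actually is on your host before copying this.

```shell
# Sketch: bind-mount the host's TensorRT headers into the samples
# container, then build inside it. Host path is an assumption for
# JetPack 5.x; adjust to your setup.
docker run -it --rm --runtime nvidia \
    -v /usr/include/aarch64-linux-gnu:/usr/include/aarch64-linux-gnu \
    nvcr.io/nvidia/deepstream-l4t:6.1-samples

# Then, inside the container:
#   git clone https://github.com/marcoslucianops/DeepStream-Yolo
#   cd DeepStream-Yolo
#   CUDA_VER=11.4 make -C nvdsinfer_custom_impl_Yolo
```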

I added to the README.md: https://github.com/marcoslucianops/DeepStream-Yolo#docker-usage

Vorapolbig commented 2 years ago

Big thank you for your help!

Mounting missing header files solved the problem. I am able to compile and run it in container now.

robin2008 commented 11 months ago

I hit the same issue on DS 6.0.1 with a Jetson Xavier NX (JetPack 4.6.4, Ubuntu 18.04). There might be a problem in the NVIDIA container runtime when auto-mounting the configured csv files.

dpkg -l |grep nvidia-container-csv
ii  nvidia-container-csv-cuda                             10.2.460-1                                 arm64        Jetpack CUDA CSV file
ii  nvidia-container-csv-cudnn                            8.2.1.32-1+cuda10.2                        arm64        Jetpack CUDNN CSV file
ii  nvidia-container-csv-tensorrt                         8.2                                        arm64        Jetpack TensorRT CSV file
ii  nvidia-container-csv-visionworks                      1.6.0.501                                  arm64        Jetpack VisionWorks CSV file

ls /etc/nvidia-container-runtime/host-files-for-container.d
cuda.csv  cudnn.csv  l4t.csv  tensorrt.csv  visionworks.csv

There are 5 csv files installed, but only cuda.csv and l4t.csv are auto-mounted, so the headers and libs defined in the other three csv files are missing. (NvInfer.h is listed in tensorrt.csv.)
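To see which entries from a given csv the runtime actually mounted, you can run a small check inside the container. `check_csv` is a hypothetical helper, not part of any NVIDIA tooling; it just parses the "type, path" lines and reports paths that don't exist.

```shell
# Hypothetical helper: report which paths listed in a
# nvidia-container-runtime csv file are missing. Run it inside the
# container against a csv copied or mounted from the host.
check_csv() {
    while IFS=', ' read -r kind path; do
        # Each line looks like "lib, /usr/include/.../NvInfer.h"
        if [ -n "$path" ] && [ ! -e "$path" ]; then
            echo "missing: $path"
        fi
    done < "$1"
}

# Example:
#   check_csv /etc/nvidia-container-runtime/host-files-for-container.d/tensorrt.csv
```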

tail tensorrt.csv

lib, /usr/include/aarch64-linux-gnu/NvInferPluginUtils.h
lib, /usr/include/aarch64-linux-gnu/NvCaffeParser.h
lib, /usr/include/aarch64-linux-gnu/NvUffParser.h
lib, /usr/include/aarch64-linux-gnu/NvOnnxConfig.h
lib, /usr/include/aarch64-linux-gnu/NvOnnxParser.h
dir, /usr/lib/python3.6/dist-packages/tensorrt
dir, /usr/lib/python3.6/dist-packages/graphsurgeon
dir, /usr/lib/python3.6/dist-packages/uff
dir, /usr/lib/python3.6/dist-packages/onnx_graphsurgeon
dir, /usr/src/tensorrt

A simple workaround is to append the contents of the csv files that are not being auto-mounted to l4t.csv or cuda.csv; a new container will then have those files auto-mounted from the host.
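The append step could be scripted roughly as follows. `merge_csvs` is a hypothetical helper; run it as root on the Jetson host, and keep the backup it makes in case you need to revert.

```shell
# Sketch: append the entries of csv files the runtime fails to
# auto-mount into l4t.csv, keeping a backup of the original.
merge_csvs() {
    csv_dir=$1; shift
    cp "$csv_dir/l4t.csv" "$csv_dir/l4t.csv.bak"  # backup first
    for f in "$@"; do
        cat "$csv_dir/$f" >> "$csv_dir/l4t.csv"
    done
}

# On the host (as root):
#   merge_csvs /etc/nvidia-container-runtime/host-files-for-container.d \
#       tensorrt.csv cudnn.csv visionworks.csv
```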