NVIDIA-AI-IOT / Deepstream-Dewarper-App


------------------------------------------------------

This sample application is no longer maintained

------------------------------------------------------

Deepstream Dewarper App

This project demonstrates how to run inference and tracking on 360° video using the dewarper plugin. Dewarping 360° video improves inference and tracking accuracy: as shown in the image above, the ML model struggles to infer in the original image but does much better on the dewarped surfaces. The project also includes a dynamic library, libnvdsgst_dewarper.so, which supports more projection types than the libnvdsgst_dewarper.so file shipped with DeepStream 5.1.

Prerequisites:

Please follow the instructions in apps/sample_apps/deepstream-app/README on how to install the prerequisites for the DeepStream SDK, the DeepStream SDK itself, and the sample apps.

Getting started:

  1. Install DeepStream 5.1 on your platform and verify that it is working by running deepstream-app.

  2. Clone the repository, preferably into $DEEPSTREAM_DIR/sources/apps/sample_apps.

    $ git clone https://github.com/NVIDIA-AI-IOT/Deepstream-Dewarper-App.git

  3. Replace the old dewarper plugin binary with the new binary, which includes 15 more projection types. Note: keep the old files in case you want to revert to them later.

    • Replace the libnvdsgst_dewarper.so binary in /opt/nvidia/deepstream/deepstream-5.1/lib/gst-plugins/ with the binary provided in this repo under plugin_libraries
    • Replace the nvds_dewarper_meta.h file in /opt/nvidia/deepstream/deepstream-5.1/sources/includes/
    • Note: the Jetson library is now included here
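The replacement step above can be sketched as the following shell commands. The paths assume a default DeepStream 5.1 install under /opt/nvidia/deepstream and that both files sit under plugin_libraries in the cloned repo; adjust them to your setup.

```shell
# Back up the stock plugin so you can revert later, then install the
# versions shipped in this repo. Run from the cloned repository root.
DS=/opt/nvidia/deepstream/deepstream-5.1
sudo cp $DS/lib/gst-plugins/libnvdsgst_dewarper.so \
        $DS/lib/gst-plugins/libnvdsgst_dewarper.so.bak
sudo cp plugin_libraries/libnvdsgst_dewarper.so $DS/lib/gst-plugins/
sudo cp plugin_libraries/nvds_dewarper_meta.h $DS/sources/includes/
```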
  4. Get the TLT PeopleNet model and label file. Download these files into the inference_files directory.

  5. Compile the program

    • $ cd deepstream-dewarper-app/
    • $ make
    • $ ./deepstream-dewarper-app [1:file sink|2:fakesink|3:display sink] [1:without tracking|2:with tracking] [ ] [ ] ... [ ]
    • Single Stream
    • $ ./deepstream-dewarper-app 3 1 file:///home/nvidia/sample_office.mp4 6 one_config_dewarper.txt (to display)
    • // Single Stream for Perspective Projection type (needs config file change)
    • $ ./deepstream-dewarper-app 3 1 file:///home/nvidia/yoga.mp4 0
    • Multi Stream
    • $ ./deepstream-dewarper-app 3 1 file:///home/nvidia/sample_cam6.mp4 6 one_config_dewarper.txt file:///home/nvidia/sample_cam6.mp4 6 one_config_dewarper.txt

The following description focuses on the default use case of detecting people in a cubicle office environment, but you can use it to test other types of applications that need the dewarper functionality (see the Note below). For more information on the general functionality and further examples, see the DeepStream Plugin Development Guide.

Dewarping configuration files are provided in the dewarper_config_files directory.

Parameters: uri - the input video stream

The dewarping parameters for a given camera can be configured in the provided config file to generate dewarped surfaces from the input video stream.
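For orientation, a dewarper config file is an INI-style file with a [property] group for the output buffers and one [surface&lt;n&gt;] group per dewarped surface. The sketch below is illustrative only: the values (angles, focal length, projection type) are placeholders, not tuned for any particular camera, and are not taken from this repo's configs.

```
[property]
output-width=960
output-height=752
num-batch-buffers=4

[surface0]
# projection type of the dewarped surface (e.g. 1=PushBroom, 2=VertRadCyl)
projection-type=1
surface-index=0
# dewarped surface dimensions and camera orientation (illustrative values)
width=3886
height=666
top-angle=30
bottom-angle=-30
pitch=90
yaw=0
roll=0
focal-length=437
```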

Note: the gst-nvdewarper plugin uses the "VRWorks 360 Video SDK". For further details, please refer to https://developer.nvidia.com/vrworks/vrworks-360video/download

For description of general dewarper parameters please visit the DeepStream Plugin Development Guide.



Common Fields/Parameters

Please refer to GST-NVDEWARPER configuration file parameters for details.