dusty-nv / jetson-inference

Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
https://developer.nvidia.com/embedded/twodaystoademo
MIT License

GStreamer not working on Windows 10 #968

Closed rajeshroy402 closed 1 year ago

rajeshroy402 commented 3 years ago

Hi, I have an issue with remote streaming from my MIPI CSI camera to my Windows/Linux machine. I have tested video-viewer on my Jetson Nano 4GB, but the RTP stream is not showing up.

So, let me walk you through my commands: I first logged into the Jetson Nano, ran docker/run.sh, and then used this command syntax: video-viewer csi://0 rtp://<remote-ip>:1234

It runs and shows something like this:

[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera -- attempting to create device csi://0
[gstreamer] gstCamera pipeline string:
[gstreamer] nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, framerate=30/1, format=(string)NV12 ! nvvidconv flip-method=2 ! video/x-raw ! appsink name=mysink
nvbuf_utils: Could not get EGL display connection
[gstreamer] gstCamera successfully created device csi://0
[video]  created gstCamera from csi://0
------------------------------------------------
gstCamera video options:
------------------------------------------------
  -- URI: csi://0
     - protocol:  csi
     - location:  0
  -- deviceType: csi
  -- ioType:     input
  -- codec:      raw
  -- width:      1280
  -- height:     720
  -- frameRate:  30.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: rotate-180
  -- loop:       0
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! omxh264enc bitrate=4000000 ! video/x-h264 !  rtph264pay config-interval=1 ! udpsink host=MSI port=1234 auto-multicast=true
[video]  created gstEncoder from rtp://MSI:1234
------------------------------------------------
gstEncoder video options:
------------------------------------------------
  -- URI: rtp://MSI:1234
     - protocol:  rtp
     - location:  MSI
     - port:      1234
  -- deviceType: ip
  -- ioType:     output
  -- codec:      h264
  -- width:      0
  -- height:     0
  -- frameRate:  30.000000
  -- bitRate:    4000000
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
------------------------------------------------
[OpenGL] failed to open X11 server connection.
[OpenGL] failed to create X11 Window.
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> nvvconv0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> nvarguscamerasrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvvconv0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvarguscamerasrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvvconv0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvarguscamerasrc0
[gstreamer] gstreamer message stream-start ==> pipeline0
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 4
   Output Stream W = 1280 H = 720
   seconds to Run    = 0
   Frame Rate = 120.000005
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstCamera -- map buffer size was less than max size (1382400 vs 1382407)
[gstreamer] gstCamera recieve caps:  video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)NV12
[gstreamer] gstCamera -- recieved first frame, codec=raw format=nv12 width=1280 height=720 size=1382407
RingBuffer -- allocated 4 buffers (1382407 bytes each, 5529628 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (2764800 bytes each, 11059200 bytes total)
video-viewer:  captured 1 frames (1280 x 720)
RingBuffer -- allocated 2 buffers (1382400 bytes each, 2764800 bytes total)
[gstreamer] gstEncoder-- starting pipeline, transitioning to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> udpsink0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264pay0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter2
[gstreamer] gstreamer changed state from NULL to READY ==> omxh264enc-omxh264enc0
[gstreamer] gstreamer changed state from NULL to READY ==> mysource
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline1
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264pay0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter2
[gstreamer] gstreamer changed state from READY to PAUSED ==> omxh264enc-omxh264enc0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysource
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline1
[gstreamer] gstreamer message new-clock ==> pipeline1
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264pay0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> omxh264enc-omxh264enc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysource
[gstreamer] gstEncoder -- new caps: video/x-raw, width=1280, height=720, format=(string)I420, framerate=30/1
video-viewer:  captured 2 frames (1280 x 720)
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 40
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
video-viewer:  captured 3 frames (1280 x 720)
[gstreamer] gstreamer message stream-start ==> pipeline1
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsink0
[gstreamer] gstreamer message async-done ==> pipeline1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsink0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline1
video-viewer:  captured 4 frames (1280 x 720)
video-viewer:  captured 5 frames (1280 x 720)
video-viewer:  captured 6 frames (1280 x 720)
... (frames 7 through 60 elided, one identical line per frame) ...
video-viewer:  captured 61 frames (1280 x 720)

Now, when I use this command on the receiving machine: gst-launch-1.0 -v udpsrc port=1234 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink it gives the following result in the terminal:

Setting pipeline to PAUSED ...
nvbuf_utils: Could not get EGL display connection
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
New clock: GstSystemClock

And nothing happens after that... Please let me know what's wrong. I followed the same commands on my guest Ubuntu OS and ran into the same issue.

dusty-nv commented 3 years ago

Hi @rajeshroy402, can you ping the PC you are sending the RTP stream to from your Jetson?

When you run video-viewer, can you try putting the IP address of your PC here instead of the hostname, e.g.:

video-viewer csi://0 rtp://192.168.1.100:1234

You may also want to try changing the port to 5000 in case there is a firewall running on your PC. Try disabling the firewall if you have one.

Also, on your PC, do you see the RX packet count on your network interface going up while the stream is being sent?
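A quick way to check whether the RTP packets are actually reaching the PC, independent of GStreamer, is a minimal UDP listener (a sketch, assuming Python 3 is available on the receiving machine; `probe_udp` is my own helper name, and the port should match the one given to video-viewer):

```python
import socket

def probe_udp(port, timeout=5.0, bufsize=65536):
    """Listen on the given UDP port and report whether any datagram arrives."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind(("0.0.0.0", port))
    try:
        data, addr = sock.recvfrom(bufsize)
        return f"received {len(data)} bytes from {addr[0]}"
    except socket.timeout:
        return "no packets received (check IP, port, and firewall)"
    finally:
        sock.close()

if __name__ == "__main__":
    print(probe_udp(1234))
```

If this prints "no packets received" while video-viewer is streaming, the problem is routing or firewall, not the GStreamer receive pipeline.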

rajeshroy402 commented 3 years ago

Hi, thanks for reaching out. I tried everything as you instructed, but it's still not working. I logged into the Jetson over USB, opened the Docker container, and ran the RTP command. I have disabled the firewall and also tried different port values, but it still isn't helping. Here are my logs showing what happens:

root@jetson-desktop:/jetson-inference# video-viewer csi://0 rtp://MSI:5000
[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera -- attempting to create device csi://0
[gstreamer] gstCamera pipeline string:
[gstreamer] nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, framerate=30/1, format=(string)NV12 ! nvvidconv flip-method=2 ! video/x-raw ! appsink name=mysink
nvbuf_utils: Could not get EGL display connection
[gstreamer] gstCamera successfully created device csi://0
[video]  created gstCamera from csi://0
------------------------------------------------
gstCamera video options:
------------------------------------------------
  -- URI: csi://0
     - protocol:  csi
     - location:  0
  -- deviceType: csi
  -- ioType:     input
  -- codec:      raw
  -- width:      1280
  -- height:     720
  -- frameRate:  30.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: rotate-180
  -- loop:       0
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! omxh264enc bitrate=4000000 ! video/x-h264 !  rtph264pay config-interval=1 ! udpsink host=MSI port=5000 auto-multicast=true
[video]  created gstEncoder from rtp://MSI:5000
------------------------------------------------
gstEncoder video options:
------------------------------------------------
  -- URI: rtp://MSI:5000
     - protocol:  rtp
     - location:  MSI
     - port:      5000
  -- deviceType: ip
  -- ioType:     output
  -- codec:      h264
  -- width:      0
  -- height:     0
  -- frameRate:  30.000000
  -- bitRate:    4000000
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
------------------------------------------------
[OpenGL] failed to open X11 server connection.
[OpenGL] failed to create X11 Window.
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> nvvconv0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> nvarguscamerasrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvvconv0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvarguscamerasrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvvconv0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvarguscamerasrc0
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0
   Camera mode  = 4
   Output Stream W = 1280 H = 720
   seconds to Run    = 0
   Frame Rate = 120.000005
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstCamera -- map buffer size was less than max size (1382400 vs 1382407)
[gstreamer] gstCamera recieve caps:  video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)NV12
[gstreamer] gstCamera -- recieved first frame, codec=raw format=nv12 width=1280 height=720 size=1382407
RingBuffer -- allocated 4 buffers (1382407 bytes each, 5529628 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (2764800 bytes each, 11059200 bytes total)
video-viewer:  captured 1 frames (1280 x 720)
RingBuffer -- allocated 2 buffers (1382400 bytes each, 2764800 bytes total)
[gstreamer] gstEncoder-- starting pipeline, transitioning to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> udpsink0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264pay0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter2
[gstreamer] gstreamer changed state from NULL to READY ==> omxh264enc-omxh264enc0
[gstreamer] gstreamer changed state from NULL to READY ==> mysource
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline1
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264pay0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter2
[gstreamer] gstreamer changed state from READY to PAUSED ==> omxh264enc-omxh264enc0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysource
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline1
[gstreamer] gstreamer message new-clock ==> pipeline1
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264pay0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> omxh264enc-omxh264enc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysource
[gstreamer] gstEncoder -- new caps: video/x-raw, width=1280, height=720, format=(string)I420, framerate=30/1
video-viewer:  captured 2 frames (1280 x 720)
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 40
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message stream-start ==> pipeline1
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsink0
[gstreamer] gstreamer message async-done ==> pipeline1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsink0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline1
video-viewer:  captured 3 frames (1280 x 720)
video-viewer:  captured 4 frames (1280 x 720)
video-viewer:  captured 5 frames (1280 x 720)
... (frames 6 through 27 elided, one identical line per frame) ...
video-viewer:  captured 28 frames (1280 x 720)

Now, I guess there is some issue with OpenGL... Once the CSI camera is capturing frames, I opened a new PowerShell window and ran this command there: C:\WINDOWS\system32> gst-launch-1.0 -v udpsrc port=1234 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink with the port value matching the one in my RTP command.

The error I keep on getting is this:

 gst-launch-1.0 : The term 'gst-launch-1.0' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ gst-launch-1.0 -v udpsrc port=5000 \
+ ~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (gst-launch-1.0:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

caps : The term 'caps' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of
the name, or if a path was included, verify that the path is correct and try again.
At line:2 char:2
+  caps = "application/x-rtp, media=(string)video, clock-rate=(int)9000 ...
+  ~~~~
    + CategoryInfo          : ObjectNotFound: (caps:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

rtph264depay : The term 'rtph264depay' is not recognized as the name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:3 char:2
+  rtph264depay ! decodebin ! videoconvert ! autovideosink
+  ~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (rtph264depay:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
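These PowerShell errors point to two separate problems: `gst-launch-1.0` is not on the PATH (the Windows GStreamer installer does not add its bin directory to PATH automatically), and the trailing `\` is not a line-continuation character in PowerShell (the backtick is, or the whole pipeline can go on one line, which is why `caps` and `rtph264depay` were parsed as separate commands). A small cross-platform check for the first problem (a sketch; `find_tool` is my own helper name):

```python
import shutil

def find_tool(name):
    """Return the full path of an executable found on PATH, or None if missing."""
    return shutil.which(name)

if __name__ == "__main__":
    path = find_tool("gst-launch-1.0")
    if path is None:
        print("gst-launch-1.0 not found: add the GStreamer bin directory to PATH")
    else:
        print(f"found: {path}")
```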

And when I try the same thing on my guest Ubuntu OS, it prints this:

rajesh@rajesh-virtual-machine:~/Desktop$ gst-launch-1.0 -v udpsrc port=5000  caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" !  rtph264depay ! decodebin ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
Setting pipeline to PLAYING ...
New clock: GstSystemClock

I have installed the GStreamer runtime for Windows (the installer linked above).

It's not working!

rajeshroy402 commented 3 years ago

Hi, anyone?

h3retek commented 3 years ago

@rajeshroy402

Please try this outside of Docker, by compiling jetson-inference from source. The reason: I run my Nano headless and had some Xauth problems, because if you SSH into the Nano as a regular user and then run the Docker container, the container runs as root, and Xauth doesn't like that. So, just compile this project.

After this, I got a (delayed) stream to my PC with: video-viewer --headless /dev/video0 rtp://10.0.0.2:5000 (note the --headless flag; change /dev/video0 to your csi://0).

Then, on your PC, install VLC and create a file on your desktop (or anywhere) named foo.sdp containing:

c=IN IP4 127.0.0.1
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000

where 5000 is the port number used with video-viewer.

Open this file with VLC, and presto: the stream plays.
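The SDP file above can also be generated programmatically, which avoids typos when the port changes (a sketch; `make_sdp` is my own helper, and the 127.0.0.1 in the `c=` line follows the convention shown above for a locally received stream):

```python
def make_sdp(port, payload=96, codec="H264", clock_rate=90000):
    """Build a minimal SDP description for a raw RTP H.264 video stream."""
    return (
        "c=IN IP4 127.0.0.1\n"
        f"m=video {port} RTP/AVP {payload}\n"
        f"a=rtpmap:{payload} {codec}/{clock_rate}\n"
    )

if __name__ == "__main__":
    # Write the descriptor for the port used with video-viewer, then open it in VLC.
    with open("foo.sdp", "w") as f:
        f.write(make_sdp(5000))
```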

ZhichengSong6 commented 3 years ago

@rajeshroy402 I had the same problem and solved it with the following steps: type netstat -ano in cmd and find a local address (I chose 192.168.137.1:139), then use that as your remote IP and port.