dusty-nv / jetson-inference

Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
https://developer.nvidia.com/embedded/twodaystoademo
MIT License
7.76k stars 2.97k forks

Get stuck and cannot kill when dealing with rtsp stream #1395

Closed luozy-usc closed 1 year ago

luozy-usc commented 2 years ago

Environment: Jetson Xavier NX Version: Jetson-inference build without NVMM

Problem description: When running the Python script below on an RTSP stream, the terminal freezes after printing NvMMLiteBlockCreate : Block : BlockType = 261.

Part of the script is shown below (note it needs both jetson.inference and jetson.utils imported):

import jetson.inference
import jetson.utils

input = jetson.utils.videoSource("rtsp://192.168.1.10:554/test", argv=["--input-codec=h264"])
output = jetson.utils.videoOutput('file:///nvme/test.mp4', argv=["--bitrate=4000000", "--headless"])

PS: This problem only occurs with certain video sources. (I used VLC to check the video source, and it displays on screen without issue.)

There is an interesting detail: I checked the GStreamer pipeline the script builds behind the scenes: gst-launch-1.0 rtspsrc location=rtsp://192.168.1.10:554/test latency=2000 ! queue ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12 ! videorate drop-only=true max-rate=30 ! appsink name=mysink

With this pipeline, the script gets stuck.

To get debug output, I ran the equivalent command manually in a terminal: gst-launch-1.0 rtspsrc location=rtsp://192.168.1.10:554/test latency=2000 debug=1 ! queue ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw, width=\(int\)1280, height=\(int\)720, format=\(string\)NV12 ! videorate drop-only=true max-rate=30 ! appsink name=mysink

Surprisingly, the command above works. (I verified this with jtop by checking whether NVDEC was active.)
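A likely explanation for the difference (this is my reading, not confirmed in the thread): in the terminal command the backslashes are consumed by bash before gst-launch-1.0 ever sees the argument, so they must not be baked into a string passed directly to gst_parse_launch(). A minimal Python sketch using the standard-library shlex module shows what the shell actually hands to the program:

```python
import shlex

# The caps string as typed in bash, with backslash-escaped parentheses/spaces.
shell_arg = r"video/x-raw,\ width=\(int\)1280,\ height=\(int\)720,\ format=\(string\)NV12"

# shlex.split() mimics POSIX shell word-splitting: the backslashes are
# stripped, so the program receives plain parentheses and spaces.
tokens = shlex.split(shell_arg)
print(tokens[0])
# -> video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12
```

So the escaped and unescaped forms describe the same caps; the escapes only exist to get the string through the shell intact.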

I'd appreciate any advice on solving this problem.

luozy-usc commented 2 years ago

I found some related issues:

https://github.com/dusty-nv/jetson-utils/issues/102
https://github.com/dusty-nv/jetson-utils/issues/615
https://github.com/dusty-nv/jetson-utils/issues/560

luozy-usc commented 2 years ago

PS: I modified gst-Decoder.cpp to add backslashes and debug=1 to the variable ss that is passed to gst_parse_launch(), but it still doesn't work. This suggests that the string passed to gst_parse_launch() is not interpreted the same way as a gst-launch-1.0 command typed in a terminal.

dusty-nv commented 2 years ago

PS: I modified gst-Decoder.cpp to add backslashes and debug=1 to the variable ss that is passed to gst_parse_launch(), but it still doesn't work.

Hi @luozy-usc, did you recompile/reinstall the code after making these changes?

When you launch the program, it will print out the gstreamer pipeline it uses. Do you see your debug changes reflected in that gstreamer pipeline?

luozy-usc commented 2 years ago

Hi @dusty-nv, thank you for the reply! I did recompile/reinstall the code after making those changes, and the debug changes are reflected in the pipeline. I found that if I add backslashes to the code, the pipeline raises errors, which suggests the original (unescaped) code is correct.

I worked around the problem in a somewhat hacky way.

Since gst-launch-1.0 can receive the RTSP stream successfully, I built an RTSP server to re-publish the input stream, and the re-published stream can be read by the jetson-inference API.

This method works in my case, but the solution is a little ugly... I'd welcome any suggestions from you.

dusty-nv commented 2 years ago

I found that if I add "\"s to the code

I think you can use those backslash escapes in bash (i.e. when using gst-launch-1.0), but not directly with the GStreamer APIs, so remove them from your customized pipeline (or break it up into multiple ss << mypipelinehere; statements)
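To illustrate the "break it up into pieces" suggestion: the actual code in gst-Decoder.cpp is C++ appending to an ostringstream, but the same idea can be sketched in Python. The pipeline below is the one from this thread; note the caps use plain, unescaped parentheses, since no shell is involved when the string goes straight to gst_parse_launch():

```python
# Hypothetical sketch of assembling the decoder pipeline string in pieces,
# mirroring the C++ "ss << ..." pattern from gst-Decoder.cpp.
parts = []
parts.append("rtspsrc location=rtsp://192.168.1.10:554/test latency=2000 ! ")
parts.append("queue ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! ")
# Plain parentheses here -- backslash escapes belong to bash, not to the
# string handed to gst_parse_launch().
parts.append("video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12 ! ")
parts.append("videorate drop-only=true max-rate=30 ! ")
parts.append("appsink name=mysink")

pipeline = "".join(parts)
print(pipeline)
```

The resulting string is identical to the unescaped pipeline the script prints at startup, which is the form the GStreamer API expects.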