dusty-nv / jetson-inference

Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
https://developer.nvidia.com/embedded/twodaystoademo
MIT License

about "gstEncoder -- pipeline full, skipping frame" #1765

Open Liyu202312 opened 9 months ago

Liyu202312 commented 9 months ago

Hello!

I have a question here that I would like to ask.

My platform is a Xavier NX running JetPack 5.1.1 (SDK jp5.1.1). When I create an RTSP server through video-viewer using a YUV camera, the video stutters when the client pulls the stream, and the log prints "gstEncoder -- pipeline full, skipping frame". What could be the reason?

Liyu202312 commented 9 months ago

My camera is 30fps. I see this code in your source:

```cpp
if( !mNeedData )
{
	if( mOptions.frameCount % 25 == 0 )
		LogVerbose(LOG_GSTREAMER "gstEncoder -- pipeline full, skipping frame %zu (%ux%u, %zu bytes)\n",
		           mOptions.frameCount, width, height, size);

	return true;
}
```

Does this mean the encoder can't keep up with encoding, so frames are being dropped?

Liyu202312 commented 9 months ago

When I change `if( mOptions.frameCount % 25 == 0 )` to `if( mOptions.frameCount % 30 == 0 )`, the image stutters less often.

Liyu202312 commented 9 months ago

Command line:

```shell
taskset -c 5 video-viewer /dev/video0 --bitrate=1500000 --output-codec=h265 --headless rtsp://192.168.144.100:8554/test
```

Liyu202312 commented 9 months ago

The camera is a 1080p YUV sensor.

dusty-nv commented 9 months ago

@Liyu202312 changing the print won't do it, but you can try `--output-save=temp.mp4` and see if that changes it. The idea is that it keeps the sink open and needing frames. If that works, you can `ln -s /dev/null /tmp/null.mp4` and use `--output-save=/tmp/null.mp4` instead, so the file doesn't accumulate disk space.
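The workaround above can be sketched as shell commands. The `video-viewer` invocation mirrors the reporter's original command line and requires a Jetson with jetson-inference installed, so it is guarded here:

```shell
# Point the "saved" output at /dev/null via a symlink so no disk space
# is consumed (-f makes re-running the setup idempotent)
ln -sf /dev/null /tmp/null.mp4

# Re-run the reporter's pipeline with the extra --output-save sink
# (only runs if video-viewer is actually installed on this machine)
if command -v video-viewer >/dev/null 2>&1; then
    taskset -c 5 video-viewer /dev/video0 --bitrate=1500000 --output-codec=h265 \
        --headless --output-save=/tmp/null.mp4 rtsp://192.168.144.100:8554/test
fi
```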

Liyu202312 commented 9 months ago

@dusty-nv Could it be because I hadn't switched to the L4T-R35.3.1 branch earlier? I was using the master branch before. Is there a known problem with the master branch?

Liyu202312 commented 9 months ago

The problem does seem related to the branch: when I switch back to the master branch, the issue reappears.

dusty-nv commented 9 months ago

Hmm, I don't believe the RTSP code in jetson-inference/jetson-utils has changed since R35.3.1... did you try the `--output-save=temp.mp4` trick?