Liyu202312 opened this issue 9 months ago
My camera runs at 30 fps. I see this code you provided:

```cpp
if( !mNeedData )
{
	if( mOptions.frameCount % 25 == 0 )
		LogVerbose(LOG_GSTREAMER "gstEncoder -- pipeline full, skipping frame %zu (%ux%u, %zu bytes)\n", ...);

	return true;
}
```

Does this mean the encoder can't keep up with encoding, so frames get dropped? When I change `if( mOptions.frameCount % 25 == 0 )` to `if( mOptions.frameCount % 30 == 0 )`, the image lags less often.
command line: taskset -c 5 video-viewer /dev/video0 --bitrate=1500000 --output-codec=h265 --headless rtsp://192.168.144.100:8554/test
The camera is a 1080p YUV sensor.
@Liyu202312 Changing the print won't do it, but you can try `--output-save=temp.mp4` and see if that changes it. The idea is that it keeps the sink open and needing frames. If that works, you can `ln -s /dev/null /tmp/null.mp4` and use `--output-save=/tmp/null.mp4` instead, so the file doesn't accumulate disk space.
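The symlink part of the suggestion above can be sanity-checked on its own, independent of video-viewer. This is a minimal sketch (the `/tmp/null.mp4` path comes from the suggestion; the 1 MiB write just stands in for encoder output) showing that data written through the symlink is discarded rather than stored:

```shell
#!/bin/sh
# Point /tmp/null.mp4 at /dev/null so writes to it are discarded
ln -sf /dev/null /tmp/null.mp4

# Simulate the encoder saving output to the "file" (1 MiB of data)
head -c 1048576 /dev/zero > /tmp/null.mp4

# The link still resolves to /dev/null, and /dev/null stays size 0,
# so no disk space accumulates no matter how long the stream runs
readlink /tmp/null.mp4
stat -c %s /dev/null
```

You would then run the original command with `--output-save=/tmp/null.mp4` added, and the save sink stays open without ever filling the disk.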
@dusty-nv Could it be because my previous build did not switch to the L4T-R35.3.1 branch? I was using the master branch before. Is there any problem with the master branch?

It does seem to be a branch issue: when I switch back to the master branch, the problem reappears.
Hmm, I don't believe the RTSP code in jetson-inference/jetson-utils has changed since R35.3.1... did you try the `--output-save=temp.mp4` trick?
Hello!
I have a question I would like to ask. My platform is a Xavier NX with JetPack 5.1.1. When I create an RTSP server through video-viewer using a YUV camera, the video lags when the client pulls the stream, and the log prints "gstEncoder -- pipeline full, skipping frame". What could the reason be?