Closed · goruklu closed this issue 1 month ago
How are you using these frames?
GStreamer will need a non-zero PTS unless we're dealing with the first frame.
Ping? cc @modeveci
@philn We are testing a cloud gaming app (using WebRTC) on a Broadcom device. We are currently seeing video freezing/pacing issues during playback when the synthesized PTSs are used, so we wanted to check whether the lowLatencyRendering (PTS=0) mode would work better by forcing every frame to be displayed as soon as it is decoded. After reverting https://github.com/WebPlatformForEmbedded/WPEWebKit/commit/07d5b35f4a98599385d3751c975767eed9f00c26, the stream plays with renderPTS=0, but it did not provide any improvement over using non-zero PTSs. We did not see any issues from the GStreamer pipeline (we are using the westeros sink on Broadcom) with zero PTSs.
I'll check this... That code has more issues, such as using the RTP timestamp, which is a uint32 and usually wraps around, so treating it as a monotonically increasing value is incorrect. That was changed during the update to M120...
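To make the wraparound problem concrete, here is a minimal sketch (the class name and structure are mine, not from the patch) of how a wrapping 32-bit RTP timestamp can be unwrapped into a monotonically increasing 64-bit value, assuming consecutive timestamps never jump by 2^31 ticks or more:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch: RTP timestamps are 32-bit and wrap, so a
// monotonically increasing value has to be recovered by "unwrapping"
// them into 64 bits.
class RtpTimestampUnwrapper {
public:
    uint64_t Unwrap(uint32_t ts)
    {
        if (!hasLast_) {
            hasLast_ = true;
            unwrapped_ = ts;
        } else {
            // Wrapping subtraction yields the signed delta even across
            // a 32-bit rollover (e.g. 0xFFFFFF00 -> 0x00000100).
            int32_t delta = static_cast<int32_t>(ts - last_);
            unwrapped_ += delta;
        }
        last_ = ts;
        return unwrapped_;
    }

private:
    bool hasLast_ { false };
    uint32_t last_ { 0 };
    uint64_t unwrapped_ { 0 };
};
```

Using the raw uint32 directly, a frame just after the wrap would appear to have a timestamp billions of ticks in the past, which is exactly the kind of discontinuity a decoder's timing logic cannot tolerate.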
FWIW, I tested the reverted patch here with two different (upstream) decoders (avdec_h264, vah264dec) and playback freezes at the first frame. So I suspect your decoder has some issues in its handling of timestamps.
Anyway, I'm preparing a patch fixing the use of RTP timestamps, which is definitely incorrect.
Hm, I see the RTP timestamps are used in the "generic_decoder" to track encoded/decoded frame timing information. I think we can store this in a custom GstMeta on the GStreamer side...
With that change we let GStreamer timestamp the buffers... The RTP timestamps are stored in a meta and retrieved before providing the "decoded" (actually parsed) frames to libwebrtc, which needs that information.
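A minimal model of that round trip (a hedged sketch, not the GstMeta API itself: a real implementation would register a custom GstMeta and use gst_buffer_add_meta()/gst_buffer_get_meta(); here a plain map keyed by an opaque buffer id stands in for the per-buffer meta, and all names are hypothetical):

```cpp
#include <cstdint>
#include <optional>
#include <unordered_map>

// Simplified stand-in for the GstMeta round trip described above: the
// RTP timestamp is attached before the buffer enters the decoder and
// retrieved from the corresponding output so libwebrtc can match decoded
// frames to the frames it submitted.
class RtpTimestampMetaStore {
public:
    void attach(uint64_t bufferId, uint32_t rtpTimestamp)
    {
        store_[bufferId] = rtpTimestamp;
    }

    // Returns and removes the stored timestamp, or nullopt if the buffer
    // was never tagged (e.g. a decoder-generated frame).
    std::optional<uint32_t> take(uint64_t bufferId)
    {
        auto it = store_.find(bufferId);
        if (it == store_.end())
            return std::nullopt;
        uint32_t ts = it->second;
        store_.erase(it);
        return ts;
    }

private:
    std::unordered_map<uint64_t, uint32_t> store_;
};
```

The key design point is the same either way: GStreamer owns the buffer PTS, while the RTP timestamp travels with the frame as sidecar data instead of being abused as a presentation timestamp.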
Thanks @philn we'll try your changes and let you know!
@philn we tested the patch in https://github.com/philn/WebKit/commit/4c0cb2f8bfb693a4c4c5de3a9af46b5760f5e22d and it works well for us. The logs are attached in wpewebkit-libwebrtc.log.
0:01:23.185686736 38 0xece6d2c0 LOG webkitlibwebrtcvideodecoder GStreamerVideoDecoderFactory.cpp:246:pullSample:<H264_dec_pipeline_0xece1ebe8> Output decoded frame with RTP timestamp 1600788857 -> buffer: 0xc320a000, pts 0:00:01.927712516, dts
0:01:23.220430529 38 0xece6d2c0 LOG webkitlibwebrtcvideodecoder GStreamerVideoDecoderFactory.cpp:246:pullSample:<H264_dec_pipeline_0xece1ebe8> Output decoded frame with RTP timestamp 1600791917 -> buffer: 0xed80c0a8, pts 0:00:01.962215161, dts
The patch was allstreamed... upstream, downstream! Can we close this issue then?
The patch is in 33365d6d8312c02454d9774652aa06f5eb769821; closing this issue. Thanks @philn!
The following commit https://github.com/WebPlatformForEmbedded/WPEWebKit/commit/07d5b35f4a98599385d3751c975767eed9f00c26 is preventing us from using frames with PTS/renderTime=0 as intended here: https://github.com/WebPlatformForEmbedded/WPEWebKit/blob/wpe-2.38/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/timing/timing.cc#L200
PTS=0 is needed when we want the playback pipeline to decode and render the frame ASAP, without waiting.
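The behaviour being requested can be summed up as follows (a hedged paraphrase of the rule the linked timing.cc line relies on, not the exact upstream code; the real VCMTiming logic also consults pacer and queue state):

```cpp
#include <cstdint>

// Simplified low-latency rendering rule: a render time of 0 means
// "display as soon as decoded", so no waiting is scheduled; otherwise
// the frame waits until its scheduled render time.
int64_t maxWaitingTimeMs(int64_t renderTimeMs, int64_t nowMs)
{
    if (renderTimeMs == 0)
        return 0; // low-latency: decode & display ASAP
    return renderTimeMs - nowMs;
}
```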