Open tidoust opened 3 years ago

How to keep multiple live feeds, e.g. from multiple cameras, in sync. Any way to embed timestamps in the streams?

Raised in:

The timestamp is already specified; however, it is not widely implemented: https://w3c.github.io/webrtc-extensions/#rtcrtpcontributingsource-dictionary

The missing piece is how to synchronize tracks, maybe via https://henbos.github.io/webrtc-timing/
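To make that concrete, here is a minimal sketch of how those dictionary members could be read in a browser that implements the draft. The `captureTimestamp` member comes from the webrtc-extensions document linked above and is not widely implemented, so it is modelled as an optional extension of the standard dictionary; `estimateOffsetMs` and `SyncSourceWithCapture` are hypothetical names, and the direct comparison of sender and receiver clocks is an assumption the application would have to arrange (e.g. via NTP):

```ts
// Sketch only: captureTimestamp is from the webrtc-extensions draft and
// not widely implemented, so it is modelled as an optional extension here.
type SyncSourceWithCapture = RTCRtpSynchronizationSource & {
  // Capture time of the most recent frame, in milliseconds, delivered via
  // the abs-capture-time RTP header extension (hypothetical availability).
  captureTimestamp?: number;
};

// Estimate how far apart two incoming feeds are, in milliseconds.
// A positive result means feed B plays out ahead of feed A. Assumes the
// two senders' clocks are themselves synchronized (e.g. via NTP).
function estimateOffsetMs(
  receiverA: RTCRtpReceiver,
  receiverB: RTCRtpReceiver
): number | undefined {
  const [a] = receiverA.getSynchronizationSources() as SyncSourceWithCapture[];
  const [b] = receiverB.getSynchronizationSources() as SyncSourceWithCapture[];
  if (a?.captureTimestamp === undefined || b?.captureTimestamp === undefined) {
    return undefined; // Extension not implemented, or no packets yet.
  }
  // timestamp = local delivery time, captureTimestamp = remote capture time,
  // so each difference approximates that feed's end-to-end delay.
  const delayA = a.timestamp - a.captureTimestamp;
  const delayB = b.timestamp - b.captureTimestamp;
  return delayA - delayB;
}
```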
For background: is there information on how broadcasters do this today? E.g., do they keep camera clocks in sync, or do they adapt the media in some fashion?

Two mentions I can throw into the pot:

There's a BBC R&D blog post on the synchronisation of WebRTC streams, which proposes using NMOS timing RTP header extensions to enable a synchronised playback mode.

Some discussion of synchronization here: https://w3c.github.io/mediacapture-main/getusermedia.html#introduction

I have some comments there, but supporting more varied use cases like WebRTC makes me appreciate the variety of situations that a good synchronized playback system needs to be able to support.
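Following on from that, a hedged sketch of one way such a playback system could act on a measured offset, using the `jitterBufferTarget` attribute from the same webrtc-extensions draft (again not universally implemented; `alignPlayout` and its pairing with `estimateOffsetMs` above are assumptions, not anything the drafts prescribe):

```ts
// Sketch only, continuing the example above: jitterBufferTarget is part of
// the webrtc-extensions draft and not universally implemented.
type ReceiverWithJitterTarget = RTCRtpReceiver & {
  // Requested minimum hold time for the receiver's jitter buffer, in
  // milliseconds (the draft clamps values to the range [0, 4000]).
  jitterBufferTarget?: number | null;
};

// Delay whichever feed plays out ahead so that both line up.
// offsetMs > 0 means feed B is ahead (see estimateOffsetMs above).
function alignPlayout(
  receiverA: RTCRtpReceiver,
  receiverB: RTCRtpReceiver,
  offsetMs: number
): void {
  const ahead = (offsetMs > 0 ? receiverB : receiverA) as ReceiverWithJitterTarget;
  const behind = (offsetMs > 0 ? receiverA : receiverB) as ReceiverWithJitterTarget;
  if (!("jitterBufferTarget" in ahead)) {
    return; // Attribute not implemented in this browser.
  }
  // Hold the ahead feed in its jitter buffer for the measured offset,
  // mirroring the draft's [0, 4000] ms clamp, and add no extra delay
  // to the lagging feed.
  ahead.jitterBufferTarget = Math.min(Math.abs(offsetMs), 4000);
  behind.jitterBufferTarget = 0;
}
```

Note that the attribute is only a target, not a guarantee: the jitter buffer may hold more media than requested, so a real system would re-measure the offset and adjust continuously.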