Open DearTan opened 10 months ago
For synchronization purposes, you have to look at the timestamp of each frame, not just the first one.
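The advice above (use per-frame timestamps, in some time base, rather than the container's start time) can be sketched as below. This is an illustrative, self-contained example, not JavaCV API: the `ptsToMicros` helper and its arguments are hypothetical names showing how a pts value in stream time-base units converts to microseconds, which is the unit FFmpeg-based grabbers typically report.

```java
// Hypothetical sketch: convert a frame's pts (expressed in the stream's
// time base, num/den seconds per tick) into microseconds, so per-frame
// timestamps can be compared directly during playback.
public class PtsToMicros {
    static long ptsToMicros(long pts, int timeBaseNum, int timeBaseDen) {
        // pts ticks * (num/den) seconds per tick * 1_000_000 µs per second
        return pts * 1_000_000L * timeBaseNum / timeBaseDen;
    }

    public static void main(String[] args) {
        // e.g. a 90 kHz time base, common for MPEG video streams:
        // 90_000 ticks = 1 second = 1_000_000 µs
        System.out.println(ptsToMicros(90_000, 1, 90_000));
    }
}
```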
Why is grabber.getFormatContext().start_time() not equal to the first frame's timestamp?
If you're expecting them to be equal, that's a bug in your muxer that created the stream.
How can I solve this problem? Do I need to set any parameters?
This might be fixed with pull #2144
Please try again with the snapshots: http://bytedeco.org/builds/
@DearTan, there is still very little information to draw any conclusion. Please specify the details of the situation in which the problem occurs. For example:

- The origin of the video file: did you create it yourself (e.g. is it the result of cutting or otherwise editing another video with ffmpeg), or is it a file you got from somewhere else?
- Are you observing the problem on a particular file (while others are fine), or does the problem occur systematically with many/all files?
- Is the desynchronization observed when playing in different video players, or have you developed your own player in which this is observed?
- Finally, about the desynchronization itself: is it a constant shift between video and audio, or different playback speeds? (You write that audio plays slower, i.e. desynchronization builds up over time - is that correct?)
The timestamps of the first audio frame and the first video frame are both 0, but during playback the audio plays slower than the video. How can I synchronize the audio and video?
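A common approach to the drift described above is to treat the audio clock as the master and pace video presentation against it, using each frame's own timestamp. The sketch below is illustrative only (the names `delayMicros`, `videoTs`, `audioTs` are assumptions, not JavaCV API); in a JavaCV player the timestamps would come from the grabbed frames themselves.

```java
// Hypothetical sketch: audio-master A/V sync. Before presenting a video
// frame, compute how long to wait so its timestamp lines up with the
// current position of the audio clock (both in microseconds).
public class AvSync {
    static long delayMicros(long videoTs, long audioTs) {
        long diff = videoTs - audioTs;
        // Video is ahead of audio: wait for the difference.
        // Video is behind: present immediately (a real player might drop it).
        return diff > 0 ? diff : 0;
    }

    public static void main(String[] args) {
        System.out.println(delayMicros(1_050_000, 1_000_000)); // video 50 ms ahead
        System.out.println(delayMicros(900_000, 1_000_000));   // video behind
    }
}
```

The key point is that this comparison happens on every frame, so any drift between the two streams is corrected continuously instead of accumulating over the whole playback.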