This stream->start_time is a pts, so it should be divided by 1/time_base, I think (or should I just say multiplied by time_base?). It's also meaningless for raw bitstream input, because the pts of every frame will be AV_NOPTS_VALUE, which is interpreted as -9223372036854775808. Since stream->start_time is the pts of the first frame, this sends the calculated _AbsoluteTime to somewhere before the beginning of the universe (based on the estimated age of the universe) (if it were multiplied by time_base first it probably wouldn't 😜).
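To be clearer about what I mean, here is a minimal sketch using the plain FFmpeg C API (the helper name and the fall-back-to-zero behavior for the unset case are just my assumptions, not anything from this project):

```c
#include <libavformat/avformat.h>
#include <libavutil/rational.h>

/* Convert AVStream::start_time (expressed in time_base units) to seconds.
 * For raw h264/h265 bitstream input the frames carry no pts, so start_time
 * is AV_NOPTS_VALUE; treat that as "starts at zero" instead of feeding the
 * huge negative sentinel into the absolute-time calculation. */
static double stream_start_seconds(const AVStream *st)
{
    if (st->start_time == AV_NOPTS_VALUE)
        return 0.0; /* no usable pts on raw bitstreams */

    /* Multiply by time_base (i.e. divide by 1/time_base) to get seconds. */
    return st->start_time * av_q2d(st->time_base);
}
```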
The parsed FPS of raw bitstreams is inaccurate, by the way: it's always reported as 25. It has been like this since release 20230430, but it was accurate when I last built against the official ffmpeg release/5.1 branch. The reason is likely that ffmpeg changed the underlying mechanism of FPS parsing; a quick ffprobe comparison between recent git and 5.1.2 shows different fps values reported.
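For reference, this is roughly the kind of check I ran, rewritten as a small standalone program (the program itself is just a sketch; av_guess_frame_rate, avg_frame_rate and r_frame_rate are the stock libavformat APIs/fields):

```c
#include <stdio.h>
#include <libavformat/avformat.h>

/* Print the frame rates libavformat parsed for the first video stream,
 * to compare builds (e.g. recent git vs 5.1.2) on a raw h264/h265 file. */
int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;

    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;

    if (avformat_find_stream_info(fmt, NULL) >= 0) {
        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            AVStream *st = fmt->streams[i];
            if (st->codecpar->codec_type != AVMEDIA_TYPE_VIDEO)
                continue;

            AVRational guessed = av_guess_frame_rate(fmt, st, NULL);
            printf("avg_frame_rate=%d/%d r_frame_rate=%d/%d guessed=%d/%d\n",
                   st->avg_frame_rate.num, st->avg_frame_rate.den,
                   st->r_frame_rate.num,  st->r_frame_rate.den,
                   guessed.num,           guessed.den);
            break;
        }
    }

    avformat_close_input(&fmt);
    return 0;
}
```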
(I only tested h264 and h265 bitstreams.)