Closed — jheo4 closed this issue 3 years ago
The side data of an AVPacket is not sent as part of the RTP payload, so a separate channel is needed to carry it for frame tracking.
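One way to work around this is a side channel keyed by a frame sequence number that the receiver can match against the RTP stream. The sketch below is only illustrative (all names are hypothetical, and the "send" is a stand-in for a real socket):

```cpp
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

// Hypothetical side channel: since AVPacket side data is not carried in the
// RTP payload, ship it separately, keyed by a frame sequence number.
struct SideDataMsg {
    uint32_t frame_seq;            // sequence number shared with the RTP frame
    std::vector<uint8_t> payload;  // serialized AVPacket side data
};

class SideChannel {
public:
    void send(uint32_t frame_seq, std::vector<uint8_t> data) {
        inbox_[frame_seq] = std::move(data);   // stand-in for a real socket send
    }
    // Returns nullptr when the side data for this frame never arrived.
    const std::vector<uint8_t>* lookup(uint32_t frame_seq) const {
        auto it = inbox_.find(frame_seq);
        return it == inbox_.end() ? nullptr : &it->second;
    }
private:
    std::map<uint32_t, std::vector<uint8_t>> inbox_;
};
```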
After a latency breakdown, I found that `sws_scale(struct SwsContext *c, const uint8_t *const *srcSlice, const int *srcStride, int srcSliceY, int srcSliceH, uint8_t *const *dst, const int *dstStride)` dominated the streaming latency. `sws_scale` involves copying the frame, converting the pixel format, and re-scaling. It must be addressed for practical RTP streaming.
Replaced swscale with cv::cvtColor.
When frames travel between the server and client over RTP, capturing the e2e latency of each frame is not straightforward: UDP-based RTP may drop packets and lose frames, and this can happen on both sides, the server and the client. To address this, I have to be able to track each frame. Since the current use case is synchronized with the mxre pipeline, I don't need to consider different frequencies between mxre and the application.
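The tracking idea above can be sketched as a per-frame latency table that tolerates dropped frames: the sender records a timestamp per sequence number, and a latency sample is produced only for frames that actually arrive. This is a conceptual sketch (class and method names are my own, not from the codebase):

```cpp
#include <chrono>
#include <cstdint>
#include <map>
#include <optional>

// Per-frame e2e latency tracking that tolerates UDP/RTP frame drops.
// The server calls on_send() per frame; the client-side receive time is fed
// back into on_receive(). Frames that never arrive simply produce no sample.
class LatencyTracker {
public:
    using Clock = std::chrono::steady_clock;

    void on_send(uint32_t seq) { sent_[seq] = Clock::now(); }

    // Returns the e2e latency in microseconds, or nothing if the frame's
    // send record is unknown (e.g. the frame was dropped or never sent).
    std::optional<int64_t> on_receive(uint32_t seq, Clock::time_point recv_time) {
        auto it = sent_.find(seq);
        if (it == sent_.end()) return std::nullopt;
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                      recv_time - it->second).count();
        sent_.erase(it);  // each frame is measured once
        return us;
    }

private:
    std::map<uint32_t, Clock::time_point> sent_;
};
```

In a real deployment the two halves run on different hosts, so the receive timestamp would need a shared or synchronized clock (or the client would echo the sequence number back for a round-trip measurement).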
The issues below are discussed in the comments of this thread.