shinyoshiaki / werift-webrtc

WebRTC Implementation for TypeScript (Node.js), includes ICE/DTLS/SCTP/RTP/SRTP/WEBM/MP4
MIT License
463 stars 29 forks

RTCVideoSource? #342

Open LeviPesin opened 10 months ago

LeviPesin commented 10 months ago

Is there an equivalent or a similar thing to wrtc's RTCVideoSource? My use-case is creating a MediaStreamTrack from per-frame images.

shinyoshiaki commented 10 months ago

unexist

sunpeng222 commented 9 months ago

Is there another way to obtain the data for each frame? The lengths of the RTP packets delivered by onReceiveRtp are inconsistent.

koush commented 9 months ago

You need to pipe it into ffmpeg or some other decoder. That's what I do.
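One common way to wire werift's RTP output into ffmpeg is over local UDP with an SDP file describing the stream. The sketch below assumes H.264 on payload type 96 and port 5004; the `rtp.serialize()` call and the `onReceiveRtp` subscription follow werift's examples, but verify them against the version you use:

```typescript
import { writeFileSync } from "fs";

// Build a minimal SDP describing an H.264 RTP stream on localhost.
// Payload type 96 is an assumption; use whatever your peer actually negotiates.
const buildSdp = (port: number, payloadType = 96): string =>
  [
    "v=0",
    "o=- 0 0 IN IP4 127.0.0.1",
    "s=werift-h264",
    "c=IN IP4 127.0.0.1",
    "t=0 0",
    `m=video ${port} RTP/AVP ${payloadType}`,
    `a=rtpmap:${payloadType} H264/90000`,
  ].join("\n");

const RTP_PORT = 5004;
writeFileSync("input.sdp", buildSdp(RTP_PORT));

// ffmpeg reads the RTP stream described by the SDP and remuxes it, e.g.:
//   ffmpeg -protocol_whitelist file,udp,rtp -i input.sdp -c copy out.mp4

// Then forward every packet werift hands you to ffmpeg's UDP port.
// `rtp.serialize()` re-encodes the parsed packet back to wire format:
// import { createSocket } from "dgram";
// const udp = createSocket("udp4");
// track.onReceiveRtp.subscribe((rtp) => {
//   udp.send(rtp.serialize(), RTP_PORT, "127.0.0.1");
// });
```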

sunpeng222 commented 9 months ago

> You need to pipe it into ffmpeg or some other decoder. That's what I do.

May I ask how real-time it is? Is the delay significant?

koush commented 9 months ago

It is realtime with the proper args (analyzeduration 0 and a small probesize).
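For reference, those flags map onto an ffmpeg command line like the sketch below; everything besides `-analyzeduration 0` and the small `-probesize` is an assumed example:

```typescript
// koush's low-latency flags: skip the long stream-probing phase ffmpeg
// does by default, which otherwise adds seconds of startup delay.
const lowLatencyArgs = [
  "-analyzeduration", "0", // don't spend time analyzing before producing output
  "-probesize", "32",      // 32 bytes is the minimum ffmpeg accepts
  "-fflags", "nobuffer",   // optional: reduce input buffering further
];

const ffmpegArgs = [
  ...lowLatencyArgs,
  "-protocol_whitelist", "file,udp,rtp",
  "-i", "input.sdp",
  "-f", "rawvideo", "-pix_fmt", "yuv420p", "pipe:1", // decoded frames on stdout
];
// spawn("ffmpeg", ffmpegArgs) and read raw frames from the child's stdout.
```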

sunpeng222 commented 9 months ago

> It is realtime with the proper args (analyzeduration 0 and a small probesize)

Is it necessary to use H264RtpPayload to process the received RTP data? I'm not sure what it does. Can you explain how your code works? I have received the RTP data, but I don't know how to process it into individual frames; the lengths of the received RTP packets are all different.

```ts
e.track.onReceiveRtp.subscribe(async (rtp) => {
  const h264 = H264RtpPayload.deSerialize(rtp.payload);
  console.log(h264);
});
```
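For context: the packet lengths differ because large H.264 NAL units are fragmented across several RTP packets (FU-A), and a frame is complete at the packet carrying the RTP marker bit. A sketch of reassembly, assuming the parsed `H264RtpPayload` exposes its NAL bytes as `payload` (an assumption about werift's internals, not confirmed API):

```typescript
// Prepend the Annex B start code so a decoder (ffmpeg, etc.) can find
// NAL unit boundaries in the raw byte stream.
const START_CODE = Buffer.from([0, 0, 0, 1]);
const toAnnexB = (nalu: Buffer): Buffer => Buffer.concat([START_CODE, nalu]);

// Sketch only: `h264.payload` as the reassembled NAL bytes is assumed.
// let fragments: Buffer[] = [];
// track.onReceiveRtp.subscribe((rtp) => {
//   const h264 = H264RtpPayload.deSerialize(rtp.payload);
//   fragments.push(toAnnexB(h264.payload));
//   if (rtp.header.marker) {          // last packet of this frame
//     const frame = Buffer.concat(fragments);
//     fragments = [];
//     // feed `frame` to your decoder
//   }
// });
```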

sunpeng222 commented 9 months ago

It is realtime with the proper args (analyzeduration 0 and small probesize )

I used ffmpeg to decode the H264 RTP data and save it locally, but the still parts of the video appear gray; only the regions where objects move show any color. How can I solve this? I'm not sure whether my ffmpeg parameters are set correctly.
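For context: gray stills with color only around motion are the classic symptom of decoding that started mid-GOP, before the first keyframe (IDR) and its SPS/PPS parameter sets arrived, so the decoder only has P-frame deltas to paint with. One fix is to discard data until a keyframe is seen (another is asking the sender for one via an RTCP PLI). A sketch of the keyframe gate, using the NAL type carried in the low five bits of the first payload byte:

```typescript
// The H.264 NAL unit type lives in the low 5 bits of the first byte.
const nalType = (nalu: Buffer): number => nalu[0] & 0x1f;

const NAL_IDR = 5; // keyframe slice
const NAL_SPS = 7; // sequence parameter set

// Drop everything until parameter sets / a keyframe have been seen,
// then let decoding start (sketch; integrate with your RTP handler).
let sawKeyframe = false;
const shouldDecode = (nalu: Buffer): boolean => {
  const t = nalType(nalu);
  if (t === NAL_SPS || t === NAL_IDR) sawKeyframe = true;
  return sawKeyframe;
};
```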

LeviPesin commented 7 months ago

> unexist

So... is there some workaround? I'm not sure how exactly to use ffmpeg to get a MediaStreamTrack from images.
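One workaround sketch, since werift has no built-in RTCVideoSource equivalent: do the encoding outside Node (ffmpeg turning piped images into H.264 RTP) and inject the resulting packets into a werift `MediaStreamTrack` with `writeRtp`, as werift's media examples do. The port, framerate, and exact track API below are assumptions to verify against your werift version:

```typescript
const FRAMES_RTP_PORT = 5006; // assumed local port, pick any free one

// ffmpeg reads PNG frames from stdin and emits H.264 over RTP
// (run this via spawn("ffmpeg", encodeArgs) or from a shell):
const encodeArgs = [
  "-f", "image2pipe", "-framerate", "30", "-i", "-", // per-frame images on stdin
  "-c:v", "libx264", "-tune", "zerolatency",          // low-latency encode
  "-f", "rtp", `rtp://127.0.0.1:${FRAMES_RTP_PORT}`,
];

// werift side (sketch; MediaStreamTrack/writeRtp usage as in werift's
// mediachannel examples, but verify against the version you use):
// import { createSocket } from "dgram";
// import { MediaStreamTrack } from "werift";
// const track = new MediaStreamTrack({ kind: "video" });
// const udp = createSocket("udp4");
// udp.bind(FRAMES_RTP_PORT);
// udp.on("message", (packet) => track.writeRtp(packet));
// pc.addTrack(track); // add to your RTCPeerConnection as usual
```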