Closed ZeoWorks closed 1 year ago
Thank you! Would it be possible to pipe the feed directly to WebRTC for the lowest latency possible without the need of rtmp? (Even UDP would be great).
I've seen a user achieve something similar, but it doesn't work for hardware-encoded H.264: https://github.com/wakabayashik/mpegts-to-webrtc
Hi ZeoWorks, my solution was to use ffmpeg's nut format which behaves nicely with pipes.
My ffmpeg command:
ffmpeg -f v4l2 -input_format h264 -video_size 1280x720 -i /dev/video0 -c:v copy -f nut pipe:1
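If it helps, here is a minimal sketch of wiring that same command up from Go with os/exec (the `buildFFmpegCmd` helper name is mine, and it assumes ffmpeg is on PATH and /dev/video0 exists; the caller would hook cmd.StdoutPipe() to the nut demuxer before cmd.Start()):

```go
package main

import (
	"fmt"
	"os/exec"
)

// buildFFmpegCmd constructs (but does not start) the ffmpeg process that
// copies the camera's H.264 stream into nut format on stdout.
func buildFFmpegCmd() *exec.Cmd {
	return exec.Command("ffmpeg",
		"-f", "v4l2",
		"-input_format", "h264",
		"-video_size", "1280x720",
		"-i", "/dev/video0",
		"-c:v", "copy",
		"-f", "nut",
		"pipe:1",
	)
}

func main() {
	cmd := buildFFmpegCmd()
	// cmd.StdoutPipe() would feed the nut demuxer; cmd.Start() launches ffmpeg.
	fmt.Println(cmd.Args)
}
```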
Reading a frame/sample:
func readNutSample(ndx *Demuxer, buf []byte) (int, error) {
	for {
		e, err := ndx.ReadEvent()
		if err != nil {
			return 0, err
		}
		if e.Type() == FrameEvent {
			n, err := e.(Frame).Data().Read(buf)
			return n, err
		}
	}
}
Writing to track:
ndx := NewDemuxer(stdout)     // ffmpeg nut demuxer, reading from the pipe
buf := make([]byte, 0x100000) // 1 MiB; make() zero-fills, so bytes 0-2 stay 0
buf[3] = 1                    // completes the 4-byte Annex B start code 00 00 00 01
for {
	n, _ := readNutSample(ndx, buf[4:])
	sample := media.Sample{Data: buf[:n+4], Duration: time.Second}
	track.WriteSample(sample)
}
To keep it short I didn't include error handling. I'm not sure if setting Duration to time.Second is correct, but you know what they say: "The best way to get a correct answer is to post an incorrect one"
golang nut demuxer: https://github.com/retailnext/gonut
Hi Chalky, thank you for this! Is it possible to use hardware encoding with ffmpeg prior to pipe instead of -c:v copy? (Example; h264_amf / h264_qsv / h264_nvenc)?
I also tested it with h264_qsv, but not extensively.
Thank you! After piping it, does webrtc re-encode the feed? Or is it just a passthrough of the already encoded feed?
No re-encoding; CPU load was about 3%.
Hi, is there any examples as how to pipe ffmpeg to webrtc?