Example streams can be generated and played out as follows:
```
ffmpeg -f lavfi -i testsrc=size=640x360:rate=30 -f lavfi -i sine=frequency=440 \
  -pix_fmt yuv420p -c:v libx264 -vprofile baseline -c:a aac \
  -f mpegts "srt://127.0.0.1:1234?mode=listener"
```

```
gst-launch-1.0 srtsrc uri="srt://127.0.0.1:1234?mode=caller" ! queue ! decodebin name=d \
  d. ! queue ! autovideosink \
  d. ! queue ! audioconvert ! autoaudiosink
```
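Before starting the playback pipeline, the stream can also be probed directly to confirm the listener is producing a decodable MPEG-TS stream. This is a minimal sketch and assumes the local ffprobe build was compiled with libsrt support; the port and mode mirror the commands above:

```
# Connect as a caller and print stream/codec info; run this instead of the
# GStreamer playback pipeline, since the listener accepts a single caller.
ffprobe -hide_banner "srt://127.0.0.1:1234?mode=caller"
```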
The H.264 High profile is supported too, but to play the stream out it first has to be transcoded (to constrained-baseline, as in the pipeline below).
## Generate an SRT stream with ffmpeg
```
ffmpeg -f lavfi -i testsrc -pix_fmt yuv420p -c:v libx264 -vprofile baseline \
  -f mpegts "srt://127.0.0.1:1234?mode=listener"
```
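Since the High profile is mentioned above as supported too, a High-profile source can be produced with a small variant of the same command. This is a sketch; `-profile:v high` is the standard libx264 option and is an assumption about how the High-profile stream was generated here:

```
# Same test source, but encoded with the H.264 High profile.
ffmpeg -f lavfi -i testsrc -pix_fmt yuv420p -c:v libx264 -profile:v high \
  -f mpegts "srt://127.0.0.1:1234?mode=listener"
```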
## Transcode the stream
```
gst-launch-1.0 srtsrc uri="srt://127.0.0.1:1234?mode=caller" ! queue ! decodebin ! queue ! videoconvert ! \
  x264enc tune=zerolatency ! video/x-h264, profile=constrained-baseline ! \
  mpegtsmux ! srtsink uri="srt://127.0.0.1:1235?mode=caller" wait-for-connection=false
```
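Before wiring up the WHIP tool, the transcoded output on port 1235 can be sanity-checked by temporarily listening there with a plain playback pipeline. This occupies the port the tool would otherwise listen on, so stop it before the next step; the pipeline is a sketch based on the playback example above (video only, since the transcode pipeline drops audio):

```
gst-launch-1.0 srtsrc uri="srt://127.0.0.1:1235?mode=listener" ! queue ! decodebin ! \
  queue ! videoconvert ! autovideosink
```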
## Connect
```
GST_DEBUG=1 cargo run --release -- -i 127.0.0.1:1235 -o 127.0.0.1:8888 -p 8000 -s listener | bunyan
```
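When the connection fails, GStreamer's per-category debug levels can be raised for the elements involved instead of the global level. The category names here are assumptions about which plugins the tool uses internally (srtsrc for ingest, whipsink for the WebRTC leg):

```
# Keep everything at level 2, but log the SRT source and WHIP sink verbosely.
GST_DEBUG=2,srtsrc:6,whipsink:6 cargo run --release -- -i 127.0.0.1:1235 -o 127.0.0.1:8888 -p 8000 -s listener | bunyan
```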
## Play with the WebRTC player at:
http://localhost:8000/channel
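If the player does not load, a quick check that the HTTP endpoint is at least reachable is the following; whether `/channel` answers a plain GET with the player page is an assumption:

```
curl -i http://localhost:8000/channel
```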
We noticed incredibly high compute overhead when using ffmpeg to generate an SRT stream, and it runs into `av_interleaved_write_frame(): Input/output error` after about 10 seconds.

We also noticed that the SRT stream from ffmpeg cannot be played out (no SDP offer is generated from whipsink).
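One thing that may be worth ruling out on the ffmpeg side (an assumption, not something confirmed in this issue): without `-re`, ffmpeg pulls the lavfi test sources as fast as it can, which drives CPU usage up and pushes data into the SRT socket faster than real time. Throttling each input to real time looks like this:

```
# -re before each -i reads that input at its native frame rate,
# so the encoder and the SRT output run at real-time speed.
ffmpeg -re -f lavfi -i testsrc=size=640x360:rate=30 -re -f lavfi -i sine=frequency=440 \
  -pix_fmt yuv420p -c:v libx264 -vprofile baseline -c:a aac \
  -f mpegts "srt://127.0.0.1:1234?mode=listener"
```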