It provides guidance on testing live streaming (mostly RTMP, SRT, MPEG-DASH, or HLS) or VOD from your own desktop using FFmpeg; it's pretty useful for testing and learning purposes.
Tested with:
- macOS High Sierra 10.13, macOS 10.15.2, Ubuntu 18.04
- Warning: the video asset used for the looping stream is a big download (about 263 MB).
- docker
- wget
- curl
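The quick-start commands below pipe scripts straight from GitHub into sh; if you'd rather inspect a script before running it, fetch it first with wget and run it locally:
wget https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_http_server.sh
sh start_http_server.sh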
Run this server in one of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_http_server.sh | sh
Run this encoder in another of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_hls_abr_live_stream.sh | sh
Access the stream at http://localhost:8080/master.m3u8 or at clappr's demo page
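If you'd rather sanity-check the stream from the terminal, probing the master playlist works too (this assumes ffmpeg/ffprobe is installed locally, not only inside Docker):
ffprobe -hide_banner http://localhost:8080/master.m3u8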
If you want to use a video file instead of synthetic media, run this encoder in another of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_hls_abr_live_stream_file_loop.sh | sh
Run this server in one of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_http_server.sh | sh
Run this encoder in another of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_dash_abr_live_stream.sh | sh
Access the stream at http://localhost:8080/out.mpd or go to dashjs's demo page
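You can also open the manifest directly with ffplay (assuming your local FFmpeg build includes the DASH demuxer):
ffplay http://localhost:8080/out.mpd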
Run this server in one of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_http_server.sh | sh
Run this encoder in another of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_hls_low_latency_live_stream.sh | sh
Access the stream at http://localhost:8080/stream.m3u8 or at clappr's demo page
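To watch the live playlist roll over as new segments are produced, you can poll it from yet another tab:
while true; do clear; curl -s http://localhost:8080/stream.m3u8; sleep 2; done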
Run this server in one of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_http_server.sh | sh
Run this encoder in another of your tabs:
curl -s https://raw.githubusercontent.com/leandromoreira/live-stream-from-desktop/master/start_mpeg_dash_low_latency_live_stream.sh | sh
Access the stream at http://localhost:8080/stream.mpd
# I assume you have brew already
# or you could use curl
brew install wget
brew install ffmpeg
brew install node
# the http server
npm install http-server -g
# WARNING: this is a huge download (263 MB)
wget -O bunny_1080p_30fps.mp4 http://distribution.bbb3d.renderfarming.net/video/mp4/bbb_sunflower_1080p_30fps_normal.mp4
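Once the download finishes, you can confirm the asset is playable by probing it (ffprobe ships with the ffmpeg install above):
ffprobe -hide_banner bunny_1080p_30fps.mp4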
From a synthetic FFmpeg video source (a test pattern) and a generated audio signal (a sine wave with amplitude 1/8). With a 30 fps source, keyint=30 forces a keyframe exactly every second (scene-cut detection is disabled).
ffmpeg -hide_banner \
-re -f lavfi -i "testsrc2=size=1280x720:rate=30,format=yuv420p" \
-f lavfi -i "sine=frequency=1000:sample_rate=4800" \
-c:v libx264 -preset ultrafast -tune zerolatency -profile:v high \
-b:v 1400k -bufsize 2800k -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-c:a aac -b:a 128k -f flv rtmp://<HOST>:1935/live/<STREAM>
From a file.
ffmpeg -stream_loop -1 \
-re -i <YOUR_VIDEO>.mp4 -c:v libx264 \
-x264opts keyint=30:min-keyint=30:scenecut=-1 -tune zerolatency \
-s 1280x720 -b:v 1400k -bufsize 2800k \
-f flv rtmp://<HOST>:1935/live/<STREAM>
From a synthetic FFmpeg video source (a test pattern) and a generated audio signal (a sine wave with amplitude 1/8), encoded into two renditions; keyint=30 at 30 fps keeps the keyframes of both renditions aligned.
ffmpeg -hide_banner \
-re -f lavfi -i "testsrc2=size=1280x720:rate=30,format=yuv420p" \
-f lavfi -i "sine=frequency=1000:sample_rate=4800" \
-c:v libx264 -preset ultrafast -tune zerolatency -profile:v high \
-b:v 1400k -bufsize 2800k -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-c:a aac -b:a 128k -f flv rtmp://<HOST>:1935/live/<STREAM> \
-c:v libx264 -preset ultrafast -tune zerolatency -profile:v high \
-b:v 750k -bufsize 1500k -s 640x360 -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-c:a aac -b:a 128k -f flv rtmp://<HOST>:1935/live/<STREAM>
From a file.
ffmpeg -stream_loop -1 \
-re -i <YOUR_VIDEO>.mp4 -c:v libx264 \
-x264opts keyint=30:min-keyint=30:scenecut=-1 -tune zerolatency \
-s 1280x720 -b:v 1400k -bufsize 2800k \
-f flv rtmp://<HOST>:1935/live/<STREAM> \
-x264opts keyint=30:min-keyint=30:scenecut=-1 -tune zerolatency \
-s 640x360 -b:v 750k -bufsize 1500k \
-f flv rtmp://<HOST>:1935/live/<STREAM>
Open a terminal and run the ffmpeg command:
ffmpeg -stream_loop -1 -re -i bunny_1080p_30fps.mp4 \
-c:v libx264 -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-preset superfast -profile:v baseline -level 3.0 \
-tune zerolatency -s 1280x720 -b:v 1400k \
-bufsize 1400k -use_timeline 1 -use_template 1 \
-init_seg_name init-\$RepresentationID\$.mp4 \
-min_seg_duration 2000000 -media_seg_name test-\$RepresentationID\$-\$Number\$.mp4 \
-f dash stream.mpd
In another tab, run the following command to fire up the server:
http-server -a :: -p 8081 --cors -c-1
Now you can test this with your player (using the URL http://localhost:8081/stream.mpd).
Open a terminal and run the ffmpeg command:
ffmpeg -stream_loop -1 -re -i bunny_1080p_30fps.mp4 -c:v libx264 \
-x264opts keyint=30:min-keyint=30:scenecut=-1 \
-tune zerolatency -s 1280x720 \
-b:v 1400k -bufsize 1400k \
-hls_start_number_source epoch -f hls stream.m3u8
In another tab, run the following command to fire up the server:
http-server -a :: -p 8081 --cors -c-1
Now you can test this with your player (using the URL http://localhost:8081/stream.m3u8).
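The next examples capture from a webcam and microphone through AVFoundation (macOS). If you're not sure which device index to use, list the available devices first:
ffmpeg -f avfoundation -list_devices true -i ""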
Open a terminal and run the ffmpeg command:
ffmpeg -re -pix_fmt uyvy422 -f avfoundation -i "0" -pix_fmt yuv420p \
-c:v libx264 -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-preset superfast -profile:v baseline -level 3.0 \
-tune zerolatency -s 1280x720 -b:v 1400k \
-bufsize 1400k -use_timeline 1 -use_template 1 \
-init_seg_name init-\$RepresentationID\$.mp4 \
-min_seg_duration 2000000 -media_seg_name test-\$RepresentationID\$-\$Number\$.mp4 \
-f dash stream.mpd
In another tab, run the following command to fire up the server:
http-server -a :: -p 8081 --cors -c-1
Now you can test this with your player (using the URL http://localhost:8081/stream.mpd).
You can preview your capture device with ffplay before streaming:
ffplay -f avfoundation -video_device_index 0 -audio_device_index 0 \
-pixel_format uyvy422 -framerate 30 -video_size 1280x720 -i "default"
Open a terminal and run the ffmpeg command:
ffmpeg -f avfoundation -video_device_index 0 -audio_device_index 0 \
-pixel_format uyvy422 -framerate 30 -video_size 640x480 -i "default" \
-c:v libx264 -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-tune zerolatency -b:v 1000k -bufsize 2000k \
-c:a aac -b:a 128k \
-hls_time 5 -hls_start_number_source epoch \
-f hls stream.m3u8
In another tab, run the following command to fire up the server:
http-server -a :: -p 8081 --cors -c-1
Now you can test this with your player (using the URL http://localhost:8081/stream.m3u8).
Open a terminal and run the ffmpeg command:
ffmpeg -f avfoundation -video_device_index 0 -audio_device_index 0 \
-pixel_format uyvy422 -framerate 30 -video_size 640x480 -i "default" \
-c:v libx264 -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-tune zerolatency -b:v 600k -bufsize 1200k -preset superfast \
-c:a aac -b:a 128k \
-f flv rtmp://yourserver:1935/live/yourstream_key
#!/bin/bash
# this is necessary since ffmpeg stops listening once a player drops
while true
do
ffmpeg -hide_banner -loglevel verbose \
-re -f lavfi -i testsrc2=size=1280x720:rate=30,format=yuv420p \
-f lavfi -i sine=frequency=1000:sample_rate=44100 \
-c:v libx264 -preset veryfast -tune zerolatency -profile:v baseline \
-vf "drawtext=text='RTMP streaming':box=1:boxborderw=10:x=(w-text_w)/2:y=(h-text_h)/2:fontsize=128:fontcolor=black" \
-b:v 1000k -bufsize 2000k -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-c:a aac -b:a 128k \
-f flv -listen 1 -rtmp_live live "rtmp://0.0.0.0:1935/live/app"
sleep 0.4
done
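Since this script acts as the RTMP server and generates its own test source, all you need is a player connecting to it, for example:
ffplay rtmp://localhost:1935/live/app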
ffmpeg -hide_banner -loglevel verbose \
-re -f lavfi -i "testsrc2=size=1280x720:rate=30,format=yuv420p" \
-f lavfi -i "sine=frequency=1000:sample_rate=44100" \
-c:v libx264 -preset veryfast -tune zerolatency -profile:v baseline \
-b:v 1000k -bufsize 2000k -x264opts keyint=30:min-keyint=30:scenecut=-1 \
-f mpegts "srt://0.0.0.0:1234?mode=listener&smoother=live&transtype=live"
You can replace codecs, duration, text, and other parameters (resolution, sample rate, presets, and so on) to suit your needs. Here's a 10.5-second clip:
ffmpeg -y -hide_banner -loglevel verbose \
-f lavfi -i 'testsrc2=size=768x432:duration=10.5:rate=30,format=yuv420p' \
-f lavfi -i 'sine=frequency=1000:duration=10.5:sample_rate=44100' \
-c:v libx264 -preset veryfast -tune zerolatency -profile:v baseline \
-vf "drawtext=text='Sample Test H264/AAC':box=1:boxborderw=10:x=(w-text_w)/2:y=(h-text_h)/2:fontsize=64:fontcolor=black" \
-b:v 1000k -bufsize 1000k -x264opts 'keyint=30:min-keyint=30:scenecut=-1' \
-c:a aac -b:a 128k \
sample_10.5s_h264_30fps_768x432_aac_44100.mp4
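To double-check the result (codecs, resolution, duration), probe the generated file:
ffprobe -hide_banner sample_10.5s_h264_30fps_768x432_aac_44100.mp4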
It generates video and audio from the /dev/urandom device, producing a stream that looks a lot like old analog TV noise.
In analog video and television, noise is a random dot pixel pattern of static displayed when no transmission signal is obtained by the antenna receiver.
# if you mess with the -video_size you can have bigger/smaller noise patterns than this. :D
ffmpeg -f rawvideo -pixel_format rgb8 -video_size 640x360 \
-framerate 60 -i /dev/urandom \
-f u8 -ar 48000 -ac 1 -i /dev/urandom \
-sws_flags neighbor -s 640x360 urandom.mp4
# you can replace the binary with any large chunk of binary data (a library, ...)
ffmpeg -f rawvideo -pixel_format rgb8 -video_size 32x23 \
-framerate 60 -i /usr/local/Cellar/ffmpeg/4.3_2/bin/ffmpeg \
-f u8 -ar 48000 -ac 1 -i /usr/local/Cellar/ffmpeg/4.3_2/bin/ffmpeg \
-sws_flags neighbor -s 640x360 -t 5s -pix_fmt yuv420p ffmpeg.mp4
ffplay -f lavfi -i mandelbrot=size=640x320:rate=60 \
-vf "drawtext = text = 'UTC %{gmtime}':fontsize=24:fontcolor=white:boxcolor=black@0.5:x=(w-text_w)/2:y=4:box=1:boxborderw=5"