homebridge-plugins / homebridge-camera-ffmpeg

Homebridge Plugin Providing FFmpeg-based Camera Support
https://homebridge-plugins.github.io/homebridge-camera-ffmpeg/
Apache License 2.0

No audio #9

Closed: StSimmons closed this issue 4 years ago

StSimmons commented 8 years ago

There appears to be no audio sent to HomeKit.

My setup (on OS X) creates a stream with these parameters:

-re -f avfoundation -video_size 1280x720 -framerate 30 -i 0:3 -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params XCRzu2fUEo57aBQkoylhjm413JUwnrUP47BQU7yD srtp://192.168.2.57:55484?rtcpport=55484&localrtcpport=55484&pkt_size=1378

Pushing to an mp4 file does produce audio, yet the homebridge stream does not.

KhaosT commented 8 years ago

Yeah, this is expected for now... I can't figure out a way to force ffmpeg to output the audio RTP stream at the designated ptime. If you know a way to do that, let me know and I'll add audio support.

StSimmons commented 8 years ago

I have a friend who is a regular contributor to ffmpeg. Could you explain that a little more verbosely, and I'll fire him a message?

KhaosT commented 8 years ago

@StSimmons That would be nice. Basically, RTP has a packet time (ptime) that controls the length of time, in milliseconds, represented by the media in a packet. HomeKit expects a ptime of 20 ms, but so far in all my experiments ffmpeg outputs packets at seemingly random intervals, and iOS rejects those packets.

Here are the output parameters I'm currently using for the audio experiment:

-c:a libopus -vn -ac 1 -ar 16k -b:a 16k -vbr on -payload_type 110 -ssrc 2 -f rtp "rtp://127.0.0.1:5600"
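
For context on that 20 ms requirement: at a fixed sample rate, a fixed ptime means every RTP packet must carry exactly the same number of samples. A quick sketch of the arithmetic (the 16 kHz rate matches the -ar 16k above):

// HomeKit expects ptime = 20 ms: every RTP packet must represent exactly
// 20 ms of audio. At 16 kHz mono that is a fixed number of samples:
var sampleRate = 16000;                        // Hz, matches -ar 16k above
var ptimeMs = 20;                              // packet time HomeKit expects
var samplesPerPacket = sampleRate * (ptimeMs / 1000);
console.log(samplesPerPacket);                 // 320 samples (one 20 ms Opus frame)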

StSimmons commented 8 years ago

He mentioned that those parameters should lead to 20 ms packets from Opus. However, that isn't necessarily 1:1 with RTP packetisation. Specifically, he suggested tweaking lavf/rtpenc.c to set max_frames_per_packet = 1 for Opus.

KhaosT commented 8 years ago

@StSimmons thanks... ffmpeg is just really strange... For the iSight camera, if I use -f avfoundation -r 29.97 -i 0:0 as my source with the modified version, I can get the audio stream working fine, but the video stream basically becomes unwatchable. If I use -re -f avfoundation -r 29.97 -i 0:0, the video stream works really smoothly, but the audio stream just stops working.

You can try the modified version if you want, but I'm not sure what else I can tweak to get this to work...

julianfez commented 8 years ago

I have found some info related to streaming audio and video: https://www.wowza.com/forums/content.php?213-How-to-use-FFmpeg-with-Wowza-Media-Server-(MPEG-TS)

albertodig commented 7 years ago

Any news on solving the audio problem? The project is very interesting... congratulations, guys!

julianfez commented 7 years ago

I've been playing with ffmpeg and got audio working with video, and I've been trying different presets. At the beginning both audio and video were choppy; I fixed the audio, but the video is still choppy. If I take the audio out, the video alone is really smooth and fast. Any ideas?

KhaosT commented 7 years ago

I think iOS tries to perform some kind of video/audio syncing, and the way ffmpeg sends the frames messes it up...

julianfez commented 7 years ago

I finally fixed the audio with libfdk; I can't with libopus, it's still choppy.

KhaosT commented 7 years ago

Nice! Mind sharing the process?

julianfez commented 7 years ago

I just changed the pkt_size on the video and audio. The video seems to load quickly with 1316, and I set the audio to 188 because with larger packets libfdk made a sound like a wave. (188 bytes is the size of one MPEG-TS packet, and 1316 = 7 × 188, a common MPEG-TS payload size, which is where those values come from.) For me this config just works better. Anyone can play with different video codecs like libx264 and rawvideo; I have h264. Let me know how it works for you.

'use strict';
var uuid, Service, Characteristic, StreamController;

var fs = require('fs');
var ip = require('ip');
var spawn = require('child_process').spawn;

module.exports = {
  FFMPEG: FFMPEG
};

function FFMPEG(hap, ffmpegOpt) {
  uuid = hap.uuid;
  Service = hap.Service;
  Characteristic = hap.Characteristic;
  StreamController = hap.StreamController;

  if (!ffmpegOpt.source) {
    throw new Error("Missing source for camera.");
  }

  this.ffmpegSource = ffmpegOpt.source;
  this.ffmpegImageSource = ffmpegOpt.stillImageSource;

  this.services = [];
  this.streamControllers = [];

  this.pendingSessions = {};
  this.ongoingSessions = {};

  var numberOfStreams = ffmpegOpt.maxStreams || 2;
  var videoResolutions = [];
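  // Note: this resolution list is assembled below, but the literal
  // options.video.resolutions array further down is what actually gets advertised.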

  var maxWidth = ffmpegOpt.maxWidth;
  var maxHeight = ffmpegOpt.maxHeight;
  var maxFPS = (ffmpegOpt.maxFPS > 30) ? 30 : ffmpegOpt.maxFPS;

  if (maxWidth <= 320) {
    if (maxHeight <= 240) {
      videoResolutions.push([320, 240, maxFPS]);
      if (maxFPS > 15) {
        videoResolutions.push([320, 240, 15]);
      }
    }

    if (maxHeight <= 180) {
      videoResolutions.push([320, 180, maxFPS]);
      if (maxFPS > 15) {
        videoResolutions.push([320, 180, 15]);
      }
    }
  }

  if (maxWidth <= 480) {
    if (maxHeight <= 360) {
      videoResolutions.push([480, 360, maxFPS]);
    }

    if (maxHeight <= 270) {
      videoResolutions.push([480, 270, maxFPS]);
    }
  }

  if (maxWidth <= 640) {
    if (maxHeight <= 480) {
      videoResolutions.push([640, 480, maxFPS]);
    }

    if (maxHeight <= 360) {
      videoResolutions.push([640, 360, maxFPS]);
    }
  }

  if (maxWidth <= 1280) {
    if (maxHeight <= 960) {
      videoResolutions.push([1280, 960, maxFPS]);
    }

    if (maxHeight <= 720) {
      videoResolutions.push([1280, 720, maxFPS]);
    }
  }

  if (maxWidth <= 1920) {
    if (maxHeight <= 1080) {
      videoResolutions.push([1280, 720, maxFPS]);
    }
  }

  let options = {
    proxy: false, // Requires RTP/RTCP MUX Proxy
    srtp: true, // Supports SRTP AES_CM_128_HMAC_SHA1_80 encryption
    video: {
      resolutions: [
        [320, 240, 15], // Apple Watch requires this configuration
        [1280, 960, 30],
        [1280, 720, 30],
        [1024, 768, 30],
        [640, 480, 30],
        [640, 360, 30],
        [480, 360, 30],
        [480, 270, 30],
        [320, 240, 30],
        [320, 180, 30]
        ],
      codec: {
        profiles: [2], // Enum, please refer StreamController.VideoCodecParamProfileIDTypes
        levels: [2] // Enum, please refer StreamController.VideoCodecParamLevelTypes
      }
    },
    audio: {
      comfort_noise: true,
      codecs: [
        {
          type: "AAC-eld",
          samplerate: 16
        }
      ]
    }
  }

  this.createCameraControlService();
  this._createStreamControllers(numberOfStreams, options); 
}

FFMPEG.prototype.handleCloseConnection = function(connectionID) {
  this.streamControllers.forEach(function(controller) {
    controller.handleCloseConnection(connectionID);
  });
}

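// Snapshot handler: run ffmpeg once against the still-image source (falling
// back to the main source) and collect the image bytes from stdout.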
FFMPEG.prototype.handleSnapshotRequest = function(request, callback) {
  let resolution = request.width + 'x' + request.height;
  var imageSource = this.ffmpegImageSource !== undefined ? this.ffmpegImageSource : this.ffmpegSource;
  let ffmpeg = spawn('ffmpeg', (imageSource + ' -t 1 -s '+ resolution + ' -f image2 -').split(' '), {env: process.env});
  var imageBuffer = Buffer.alloc(0);

  ffmpeg.stdout.on('data', function(data) {
    imageBuffer = Buffer.concat([imageBuffer, data]);
  });
  ffmpeg.on('close', function(code) {
    callback(undefined, imageBuffer);
  });
}

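// prepareStream: stash the controller's address, ports and SRTP keys for this
// session, and respond with our own address and chosen SSRCs.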
FFMPEG.prototype.prepareStream = function(request, callback) {
  var sessionInfo = {};

  let sessionID = request["sessionID"];
  let targetAddress = request["targetAddress"];

  sessionInfo["address"] = targetAddress;

  var response = {};

  let videoInfo = request["video"];
  if (videoInfo) {
    let targetPort = videoInfo["port"];
    let srtp_key = videoInfo["srtp_key"];
    let srtp_salt = videoInfo["srtp_salt"];

    let videoResp = {
      port: targetPort,
      ssrc: 1,
      srtp_key: srtp_key,
      srtp_salt: srtp_salt
    };

    response["video"] = videoResp;

    sessionInfo["video_port"] = targetPort;
    sessionInfo["video_srtp"] = Buffer.concat([srtp_key, srtp_salt]);
    sessionInfo["video_ssrc"] = 1; 
  }

  let audioInfo = request["audio"];
  if (audioInfo) {
    let targetPort = audioInfo["port"];
    let srtp_key = audioInfo["srtp_key"];
    let srtp_salt = audioInfo["srtp_salt"];

    let audioResp = {
      port: targetPort,
      ssrc: 2,
      srtp_key: srtp_key,
      srtp_salt: srtp_salt
    };

    response["audio"] = audioResp;

    sessionInfo["audio_port"] = targetPort;
    sessionInfo["audio_srtp"] = Buffer.concat([srtp_key, srtp_salt]);
    sessionInfo["audio_ssrc"] = 2; 
  }

  let currentAddress = ip.address();
  var addressResp = {
    address: currentAddress
  };

  if (ip.isV4Format(currentAddress)) {
    addressResp["type"] = "v4";
  } else {
    addressResp["type"] = "v6";
  }

  response["address"] = addressResp;
  this.pendingSessions[uuid.unparse(sessionID)] = sessionInfo;

  callback(response);
}

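// handleStreamRequest: on "start", build and spawn the ffmpeg command for the
// pending session; on "stop", kill that session's ffmpeg process.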
FFMPEG.prototype.handleStreamRequest = function(request) {
  var sessionID = request["sessionID"];
  var requestType = request["type"];
  if (sessionID) {
    let sessionIdentifier = uuid.unparse(sessionID);

    if (requestType == "start") {
      var sessionInfo = this.pendingSessions[sessionIdentifier];
      if (sessionInfo) {
        var width = 1280;
        var height = 720;
        var fps = 30;
        var bitrate = 600;

        let videoInfo = request["video"];
        if (videoInfo) {
          width = videoInfo["width"];
          height = videoInfo["height"];

          let expectedFPS = videoInfo["fps"];
          if (expectedFPS < fps) {
            fps = expectedFPS;
          }

          bitrate = videoInfo["max_bit_rate"];
        }

        let targetAddress = sessionInfo["address"];
        let targetVideoPort = sessionInfo["video_port"];
        let videoKey = sessionInfo["video_srtp"];

        let targetAudioPort = sessionInfo["audio_port"];
        let audioKey = sessionInfo["audio_srtp"];

        let ffmpegCommand = this.ffmpegSource + ' -threads 0 -vcodec h264 -an -r '+ fps +' -g '+ fps +' -f h264 -coder 0 -tune zerolatency -crf 18 -vf scale='+ width +':'+ height +' -bt 1M -b:v '+ bitrate +'k -bufsize '+ bitrate +'k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params '+videoKey.toString('base64')+' -rtsp_transport tcp srtp://'+targetAddress+':'+targetVideoPort+'?rtcpport='+targetVideoPort+'&localrtcpport='+targetVideoPort+'&pkt_size=1316';
        ffmpegCommand += ' -c:a libfdk_aac -profile:a aac_eld -vn -ac 1 -ar 16000 -b:a 8000 -flags +global_header -payload_type 110 -ssrc 2 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params '+audioKey.toString('base64')+' -rtsp_transport tcp srtp://'+targetAddress+':'+targetAudioPort+'?rtcpport='+targetAudioPort+'&localrtcpport='+targetAudioPort+'&pkt_size=188';

        console.log(ffmpegCommand, videoInfo);
        let ffmpeg = spawn('ffmpeg', ffmpegCommand.split(' '), {env: process.env});
        this.ongoingSessions[sessionIdentifier] = ffmpeg;
      }

      delete this.pendingSessions[sessionIdentifier];
    } else if (requestType == "stop") {
      var ffmpegProcess = this.ongoingSessions[sessionIdentifier];
      if (ffmpegProcess) {
        ffmpegProcess.kill('SIGKILL');
      }

      delete this.ongoingSessions[sessionIdentifier];
    }
  }
}

FFMPEG.prototype.createCameraControlService = function() {
  var controlService = new Service.CameraControl();

  this.services.push(controlService);
}

// Private

FFMPEG.prototype._createStreamControllers = function(maxStreams, options) {
  let self = this;

  for (var i = 0; i < maxStreams; i++) {
    var streamController = new StreamController(i, options, self);

    self.services.push(streamController.service);
    self.streamControllers.push(streamController);
  }
}

julianfez commented 7 years ago

For me, with this config, even the latency is short, like 2 or 3 seconds, and the stream is smooth.

ghost commented 7 years ago

Are there any dependencies for this other than ffmpeg with libfdk-aac? Using this config I'm not getting any video at all, but the repo config works fine.

benzman81 commented 7 years ago

I am also getting no audio. Is there any step-by-step guide I can follow to achieve this?

KhaosT commented 7 years ago

Sorry, audio is still not supported right now.

KhaosT commented 7 years ago

@tstiman there is an "edit" feature on GitHub... you don't need to delete and re-post your reply three times... As for your problem: HomeKit is really picky about the audio stream; it needs to match exactly what it selected. In your original code you offer two audio codecs (Opus and AAC-ELD), and I think iOS will always pick Opus. However, your code is sending an AAC stream (not even AAC-ELD), and that will not work. To get sound working, you need to send the proper codec type and follow the ptime HomeKit asked for in the request.
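
To make the codec-matching point concrete, here is a minimal sketch (shaped like the options object in the ffmpeg.js paste above, not a tested configuration): advertising a single codec removes the chance of iOS negotiating Opus while ffmpeg sends AAC.

// Advertise only AAC-eld so iOS cannot select Opus during negotiation.
var audioOptions = {
  comfort_noise: false,
  codecs: [
    { type: "AAC-eld", samplerate: 16 } // 16 kHz; matches -ar 16000 on the ffmpeg side
  ]
};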

KhaosT commented 7 years ago

You'll have to check the iOS side for errors. If you are on macOS Sierra, open the Console app, pick your iPhone from the sidebar, filter to show only logs from avconferenced, and then start the stream.

julianfez commented 7 years ago

So far I have been trying to improve the audio. Here is what I did, working with type: "AAC-eld", samplerate: 16.

I changed the audio request and response to ssrc = 2 on both. I was having some issues with video, where it would freeze while the audio kept playing; the audio bitrate defaulted to 0, so I set it by adding this:

let audioInfo = request["audio"];
audioInfo["bit_rate"] = 16;

like this:

if (requestType == "start") {
  var sessionInfo = this.pendingSessions[sessionIdentifier];
  if (sessionInfo) {
    var width = 1280;
    var height = 720;
    var fps = 30;
    var bitrate = 300;
    let audioInfo = request["audio"];
    audioInfo["bit_rate"] = 16;

and I also pulled the audio port and key next to the video ones:

let targetAddress = sessionInfo["address"];
let targetVideoPort = sessionInfo["video_port"];
let videoKey = sessionInfo["video_srtp"];

let targetAudioPort = sessionInfo["audio_port"];
let audioKey = sessionInfo["audio_srtp"];

This is the video command line. Since I have five different camera brands, I don't set the pixel format; ffmpeg finds it automatically:

let ffmpegCommand = this.ffmpegSource + ' -threads 0 -c:v h264 -an -r '+ fps +' -g '+ fps +' -f h264 -coder 0 -tune zerolatency -crf 18 -vf scale='+ width +':'+ height +' -bt 1M -b:v '+ bitrate +'k -bufsize '+ bitrate +'k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params '+videoKey.toString('base64')+' -rtsp_transport tcp srtp://'+targetAddress+':'+targetVideoPort+'?rtcpport='+targetVideoPort+'&localrtcpport='+targetVideoPort+'&pkt_size=1316';

And this is the audio part. You have to start with the aac_low profile and end with aac_eld to be able to stream decent sound:

ffmpegCommand += ' -vn -profile aac_low -ar 16000 -c:a libfdk_aac -profile:a aac_eld -ab 16k -flags +global_header -sample_fmt s16 -f flac -payload_type 110 -ssrc 2 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params '+audioKey.toString('base64')+' -rtsp_transport tcp srtp://'+targetAddress+':'+targetAudioPort+'?rtcpport='+targetAudioPort+'&localrtcpport='+targetAudioPort+'&pkt_size=188';

This is working on an Ubuntu PC. I get different results on the Pi and an Intel Edison, and I don't know why; I have the same ffmpeg compile setup.

Can anyone confirm that this is working?

tstiman commented 7 years ago

How do you achieve two-way audio?

julianfez commented 7 years ago

Get a HomeKit camera.

damiandudycz commented 7 years ago

I managed to get sound working on a Raspberry Pi 2 with libfdk_aac, but when sound is added, the video becomes choppy. I tried modifying bitrates and other values in julianfez's script, but nothing helps. This is the output I get from homebridge:

-re -f v4l2 -framerate 5 -video_size 640x360 -i /dev/video0 -f alsa -i plughw:1,0 -threads 0 -vcodec h264_omx -an -pix_fmt yuv420p -r 15 -f h264_omx -vf scale=640:360 -b:v 132k -bufsize 132k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params 2NBLKWWEkmR2Sw5bUtj3kmjMDE3BNxCXTIEvKNaX srtp://192.168.3.11:53547?rtcpport=53547&localrtcpport=53547&pkt_size=1361 -c:a libfdk_aac -profile:a aac_eld -vn -ac 1 -ar 16000 -b:a 8000 -flags +global_header -payload_type 110 -ssrc 2 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params WXcz+Dj18Kh/eJI+/49Mw4arg1j0O0fnDNRcLyj1 -rtsp_transport tcp srtp://192.168.3.11:53941?rtcpport=53941&localrtcpport=53941&pkt_size=188

Can someone help me?

AdySan commented 7 years ago

So, just to be sure, since this issue is still open: is audio reliably working for anyone? FWIW, I'm using a Ubiquiti UniFi Micro camera here.

DanielWeeber commented 7 years ago

I don't think so. I'm also very interested.

llemtt commented 7 years ago

Has anyone already successfully tried a two-way "audio only" connection?

charlesmkelley commented 6 years ago

Any update on this?

zllovesuki commented 6 years ago

Video and audio streams are separate, as pointed out here: https://github.com/KhaosT/HAP-NodeJS/wiki/IP-Camera

Thus, you need to use two different ffmpeg processes to stream video and audio. I have a fork that is specially optimized for Raspberry Pi + C920: https://github.com/zllovesuki/homebridge-c920-pi

You will need to compile ffmpeg with libfdk-aac support. Otherwise, they are exactly the same.

This is the command generated by the script:

Feb 10 14:58:57 raspberrypi homebridge[32180]: -thread_queue_size 512 -ar 32000 -ac 2 -f alsa -i hw:1,0 -acodec libfdk_aac -profile:a aac_eld -flags +global_header -f null -b:a 24k -r:a 16000 -bufsize 48k -ac 1 -payload_type 110 -ssrc 15825701 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params pyTZTvOeLBFIX65gURV2GHVc9mieWqt4alGhPdNU srtp://192.168.1.90:64874?rtcpport=64874&localrtcpport=64874

As you can see, the audio stream is coming from the USB webcam, and the audio codec is libfdk_aac with the eld profile (Enhanced Low Delay, the AAC-ELD codec that Apple expects). You need to use -flags +global_header as pointed out here, or libfdk_aac will complain with an error.
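
Expressed in the plugin's own spawn pattern, the two-process approach looks roughly like this. This is a sketch: the devices, addresses, ports and SSRCs are placeholders, and the per-session SRTP arguments shown in the log above are left out.

var spawn = require('child_process').spawn;

// One ffmpeg process per RTP stream, so audio timing cannot disturb video.
var videoArgs = '-f v4l2 -i /dev/video0 -an -vcodec h264_omx' +
    ' -payload_type 99 -ssrc 1 -f rtp rtp://192.168.1.90:64872';
var audioArgs = '-f alsa -i hw:1,0 -vn -acodec libfdk_aac -profile:a aac_eld' +
    ' -flags +global_header -ac 1 -b:a 24k' +
    ' -payload_type 110 -ssrc 2 -f rtp rtp://192.168.1.90:64874';

var videoProc = spawn('ffmpeg', videoArgs.split(' '), {env: process.env});
var audioProc = spawn('ffmpeg', audioArgs.split(' '), {env: process.env});
// On a "stop" request, kill both processes instead of a single one.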


RPi Stuff

You can have mmal (hardware decoding) and omx (hardware encoding) on your RPi if you compiled your ffmpeg with that support. Technically you can decode the H264 stream from the webcam (which saves bandwidth on your USB bus compared to a rawvideo stream) and re-encode it to H264. However, the h264_omx codec doesn't have many fancy settings other than resizing...
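
As a rough shape for that decode-and-re-encode pipeline (a sketch; it assumes an ffmpeg built with mmal and omx support, and the RTSP address, bitrate and output URL are placeholders):

var spawn = require('child_process').spawn;

// Hardware decode (h264_mmal) of the local RTSP feed, then hardware
// re-encode (h264_omx). The output URL is only for illustration.
var args = ('-vcodec h264_mmal -i rtsp://127.0.0.1:8555/unicast' +
    ' -an -vcodec h264_omx -b:v 600k -f rtp rtp://127.0.0.1:5600').split(' ');
spawn('ffmpeg', args, {env: process.env, stdio: 'inherit'});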

zllovesuki commented 6 years ago

Sorry... I keep hijacking the discussion... I found a good way to stream video and audio.

multi stream

Since my house has multiple people, I want the plugin to be able to handle multiple stream requests. Thus, we have v4l2rtspserver grab the video feed and stream it locally. Then, whenever a user requests a live feed, the plugin grabs from the local stream instead of taking control of the device.

Follow this guide and compile v4l2rtspserver.

ffmpeg

The C920 has H264 compressed stream support, so we could technically decode and re-encode (all HW accelerated).

However, the default ffmpeg was not compiled with support for any of this...

  1. Refer to the first section of this gist for libfdk_aac support.
  2. Refer to this link on Reddit for omx and mmal support, or else your RPi will catch fire and disintegrate.
  3. You also need to include --enable-network --enable-protocol=tcp --enable-demuxer=rtsp --enable-decoder=h264 when compiling, or else you won't be able to use the rtsp stream from v4l2rtspserver as an input.

setup

For all intents and purposes, a 720p stream is good enough. Load the bcm2835-v4l2 module with sudo modprobe bcm2835-v4l2, and make it load on boot with echo "bcm2835-v4l2" | sudo tee -a /etc/modules

Then set the camera native resolution and stream format with /usr/bin/v4l2-ctl --set-fmt-video=width=1280,height=720,pixelformat=1 -d /dev/video0

Then run v4l2rtspserver: v4l2rtspserver -c -Q 512 -s -F 0 -H 720 -W 1280 -P 8555 -A 32000 -C 2 /dev/video0,hw:1,0

Or you can use systemd:

[Unit]
Description=Streaming Server
Before=homebridge.service
After=syslog.target network-online.target

[Service]
Type=simple
User=pi
ExecStartPre=/usr/bin/v4l2-ctl --set-fmt-video=width=1280,height=720,pixelformat=1 -d /dev/video0
ExecStart=/path/to/v4l2rtspserver -c -Q 512 -s -F 0 -H 720 -W 1280 -I 127.0.0.1 -P 8555 -A 32000 -C 2 /dev/video0,hw:1,0
Restart=on-failure
RestartSec=10
KillMode=process

[Install]
WantedBy=multi-user.target

changes to the code

Go to the plugin's root folder, and modify ffmpeg.js. ffmpegCommand should now be:

let ffmpegCommand = '-thread_queue_size 512 ' + this.ffmpegSource + ' -map 0:0 -vcodec h264_omx -r ' +
    fps + ' -vf scale=' + width + ':' + height + ' -b:v ' + bitrate + 'k -bufsize ' +
    bitrate + 'k -payload_type 99 -ssrc ' + videoSsrc + ' -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params ' +
    videoKey.toString('base64') + ' srtp://' + targetAddress + ':' + targetVideoPort + '?rtcpport=' + targetVideoPort +
    '&localrtcpport=' + targetVideoPort + '&pkt_size=1378 ' + 
    '-map 0:1 -acodec libfdk_aac -profile:a aac_eld -flags +global_header -f null -b:a 24k -r:a 16000 -bufsize 48k -ac 1 ' +
    '-payload_type 110 -ssrc ' + audioSsrc + ' -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params ' +
    audioKey.toString('base64') + ' srtp://' + targetAddress + ':' + targetAudioPort + '?rtcpport=' + targetAudioPort +
    '&localrtcpport=' + targetAudioPort;

We use -map to separate the video and audio streams... Refer to my previous comment regarding libfdk_aac.

configuration

In videoConfig, point source at the v4l2rtspserver; ffmpeg should be able to demux it. It should look something like this: -f rtsp -vcodec h264_mmal -i rtsp://127.0.0.1:8555/unicast. We use h264_mmal for hardware decoding so your RPi can handle multiple live stream requests.

Don't forget to set vcodec to h264_omx.
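
For reference, the matching camera entry in config.json would look something like this (the field names are the plugin's videoConfig options; the values here are assumptions for this setup):

{
  "name": "Pi Camera",
  "videoConfig": {
    "source": "-f rtsp -vcodec h264_mmal -i rtsp://127.0.0.1:8555/unicast",
    "vcodec": "h264_omx",
    "maxStreams": 2,
    "maxWidth": 1280,
    "maxHeight": 720,
    "maxFPS": 30
  }
}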

caveat

  1. Using RTSP + h264_mmal increases the time to initialize the stream. In my case it takes 20 seconds for the video to start playing (as opposed to almost instantly). I'm still trying to figure out why.
  2. RTSP does complain with max delay reached. need to consume packet. I'm still investigating.
  3. Depending on the movement of the stars, the direction of the wind, and the amount of water you drank, RTP might complain with Non-monotonous DTS in output stream 1:0; previous: 605510, current: 559023; changing to 605510. This may result in incorrect timestamps in the output file. I'm still investigating that.

link266 commented 5 years ago

Wish I could follow all that... :)

Does anyone have a link to instructions for how to get audio streaming from the Homebridge plugin? I've currently got video-only streaming from my UniFi Video system to HomeKit, but sadly no audio. I'm willing to try to figure out how to "recompile ffmpeg", as it sounds like that's what I need to do. I'm running Homebridge on macOS, and I think that makes a difference.

Any ideas? Not asking for anyone to write up a step-by-step but if someone can point me in the right direction I'd sure appreciate it.

github-actions[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.