lightbody closed this issue 4 years ago.
Update: I just monkey-patched "-map 0:0" to "-map 0:1" and my Unifi cameras started working. I'm not sure how to solve this generically, but at least it confirms some of the issue.
I am having the exact same problem with my UniFi G3 cameras. What did you do exactly to make it work?
I documented my workaround here: https://www.reddit.com/r/Ubiquiti/comments/ad8faw/comment/ee3sqxk?st=JR520QXV&sh=a99b8579
I can confirm that also works for me.
The change in #263 fixes this for me when I pass "mapvideo": "0:1", "mapaudio": "0:0" in the config file.
My streams are still not working. I have added "mapvideo": "0:1", "mapaudio": "0:0" to my config:
{
  "bridge": {
    "name": "Homebridge",
    "username": "CC:22:3D:E3:CE:30",
    "port": 51826,
    "pin": "031-45-154"
  },
  "description": "This is an example configuration file with one fake accessory and one fake platform. You can use this as a template for creating your own configuration file containing devices you actually own.",
  "platforms": [
    {
      "platform": "Camera-ffmpeg",
      "cameras": [
        {
          "name": "Front Camera",
          "motion": true,
          "videoConfig": {
            "source": "-rtsp_transport tcp -re -i rtsp://IP:7447/wdadawdw",
            "stillImageSource": "-i http://IP/snap.jpeg",
            "audio": true,
            "maxStreams": 2,
            "maxWidth": 1920,
            "maxHeight": 1080,
            "maxFPS": 15,
            "mapvideo": "0:1",
            "mapaudio": "0:0"
          }
        }
      ]
    }
  ]
}
I am in the same boat as MitchJackson94. The "mapvideo" and "mapaudio" are not fixing the live stream problem.
I get the still images every 12-15 seconds fine, but when I click on a camera in HomeKit, I get the error message "The camera is not responding". I do see this in the logs:
I have now got mine working. I had to install "libfdk_aac" manually, as my ffmpeg did not include it.
Do I just run "sudo apt-get install libfdk_aac"?
Just had a look at my config, and thinking back, I actually could not get libfdk_aac installed. What I did to get it working was change the audio codec to "acodec": "libopus" in the config.json. I have video but no audio.
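For anyone trying the same workaround, the relevant videoConfig fragment would look something like the sketch below (the source URL is a placeholder, and the map values assume the UniFi stream order reported in this thread). Note that "libopus" must be spelled exactly, with no space, or ffmpeg will report an "unknown encoder" error:

```json
"videoConfig": {
  "source": "-rtsp_transport tcp -re -i rtsp://IP:7447/STREAMID",
  "audio": true,
  "acodec": "libopus",
  "mapvideo": "0:1",
  "mapaudio": "0:0"
}
```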
I tried your suggestion of adding "acodec" to the config.json and received the error "unknown encoder: lib opus".
Also, I notice (captured in the screenshot as well) that srtp with my Apple TV IP address is "opening an output file". Is that normal as part of the HomeKit integration?
And one last question: the snapshots camera-ffmpeg is getting from my cameras are 320x240. Is that the default, or can it be changed? I don't have "stillImageSource" in my config and wonder if that is why.
Thanks for your help
What does your config look like? You could always try to install libfdk_aac; it will also require you to build ffmpeg.
I think that's just the size that HomeKit uses; the snapshot from the camera is 1920x1080.
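For anyone going the libfdk_aac route mentioned above: most distros don't ship ffmpeg with libfdk_aac because of licensing, so you have to compile it yourself. A rough sketch for a Debian-based system (package names and configure flags are assumptions, not verified on any particular distro):

```
# Rough sketch: building ffmpeg with libfdk-aac on a Debian-based system.
# Package names below are assumptions; adjust for your distribution.
sudo apt-get install libfdk-aac-dev build-essential yasm
git clone https://git.ffmpeg.org/ffmpeg.git
cd ffmpeg
# --enable-nonfree is required because libfdk_aac is not GPL-compatible
./configure --enable-libfdk-aac --enable-nonfree
make -j"$(nproc)"
sudo make install
```

After installing, check that "ffmpeg -encoders" lists libfdk_aac before pointing homebridge-camera-ffmpeg at the new binary.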
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
I have mostly UniFi G3 cameras around the house, but also one Doorbird at my front door. I've found that the Doorbird streams flawlessly, but all of my UniFi cameras won't stream on my iOS devices. As I dug into it, one key difference I found is that the RTSP stream for my Doorbird is video only, while the UniFi streams are audio + video.
Here are the debug logs for the working DoorBird:
And here are the debug logs for the broken Unifi camera:
Based on the debug logs, it seems like homebridge-camera-ffmpeg assumes the video is always at stream 0 and the audio at stream 1. In the case of my DoorBird, which has no audio stream (not sure why!), the video is indeed at stream 0. But with my UniFi cameras, stream 0 is audio and stream 1 is video. The logs seem to suggest it's getting mixed up.
I'm an ffmpeg noob big time, but it seems like the hardcoding of "-map 0:0" in the source might be part of the problem?
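The stream-order difference described above can be checked with ffprobe. A minimal sketch, using a canned sample of ffprobe-style output rather than a live camera (the layout shown, audio at 0:0 and video at 0:1, is an assumption based on the UniFi reports in this thread; against a real camera you would run something like "ffprobe -rtsp_transport tcp rtsp://IP:7447/STREAMID" and read the Stream lines yourself):

```shell
# Sample of the stream lines ffprobe prints for a UniFi-style RTSP feed
# (assumed layout: audio first, video second, as reported in this thread).
probe_output='Stream #0:0: Audio: aac (LC), 11025 Hz, mono
Stream #0:1: Video: h264 (Main), yuv420p, 1920x1080, 15 fps'

# Pull out the "0:N" index of each stream; these are the values to pass
# as "mapvideo" and "mapaudio" in the homebridge-camera-ffmpeg config.
video_map=$(printf '%s\n' "$probe_output" | grep 'Video' | sed 's/.*#\(0:[0-9]*\).*/\1/')
audio_map=$(printf '%s\n' "$probe_output" | grep 'Audio' | sed 's/.*#\(0:[0-9]*\).*/\1/')
echo "mapvideo=$video_map mapaudio=$audio_map"   # mapvideo=0:1 mapaudio=0:0
```

This also shows why a hardcoded "-map 0:0" breaks cameras whose first stream is audio: ffmpeg's "-map 0:N" selects stream N of input 0 regardless of its type.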