Sunoo / homebridge-camera-ffmpeg

Homebridge Plugin Providing FFmpeg-based Camera Support
https://sunoo.github.io/homebridge-camera-ffmpeg/
Apache License 2.0

Making use of Motion and/or MotionEye for turning the camera into a motion sensor #1084

Closed: AssetBurned closed this issue 3 years ago

AssetBurned commented 3 years ago

Describe the solution you'd like: To my understanding, this plugin requires an additional physical motion sensor to perform any actions. However, there is a project called MotionEye (https://github.com/ccrisan/motioneye) that offers a web UI for Motion (https://motion-project.github.io), which watches the camera stream, detects any changes in the video, and then performs actions.

Describe alternatives you've considered:

Additional context: Obviously it would save money and setup time to not have to deal with a separate motion sensor.

Sunoo commented 3 years ago

One user of this plugin wrote a guide for using Motion to generate the motion alerts: https://sunoo.github.io/homebridge-camera-ffmpeg/automation/motion.html

I've been intending to clean that document up for a bit, but haven't had the chance. I know some users have set this up with MotionEye, but I don't believe they've shared their configs for having motion flow through.

Is there something specific you are looking to be added to this plugin, or would it just be better documentation for using those together?

AssetBurned commented 3 years ago

Hmmm, I think that's a multi-layered question. The documentation could use a general revamp that approaches this topic more broadly.

But what bugs me is this: if Homebridge, this plugin, and MotionEye all run on the same Raspberry Pi, why is there a need to add an MQTT setup into the mix? Shouldn't some built-in scripts do the job? I mean a script that ships with this plugin and can simply be called via the MotionEye interface.

Using the automation web server this plugin offers would at least allow a plain curl command for motion start and end. That is already a big simplification of the setup (and the documentation).
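Something like this is what I have in mind, as a rough sketch (assuming the plugin's HTTP automation server is enabled via "porthttp" and a camera named "Garden"; the port, address, and camera name are placeholders):

    # Hypothetical Motion event hooks (motion.conf) calling the plugin's
    # HTTP automation server. Assumes "porthttp": 8080 in the plugin config
    # and a camera named "Garden".
    on_event_start curl -s "http://127.0.0.1:8080/motion?Garden"
    on_event_end   curl -s "http://127.0.0.1:8080/motion/reset?Garden"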

Still it means having a third web service open, plus all the streaming ports.

NorthernMan54 commented 3 years ago

Personally I find that MotionEye requires a dedicated RPi due to the CPU overhead when using it with multiple cameras.

Sunoo commented 3 years ago

When I rewrite it, I plan to do so using the HTTP service instead of MQTT, since as you said, that’s a bit easier. I have not used Motion myself though, so I can’t speak for much of it. When I have some time, I do plan to experiment with it, if for no other reason than to make sure whatever documentation I write is accurate.
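For comparison, the MQTT route boils down to publishing the camera name to the configured topic; a sketch only, since the topic and host here are assumptions based on a typical config:

    # Hypothetical MQTT triggers, assuming "motionTopic": "homebridge/motion"
    # in the plugin config and a camera named "Garden".
    mosquitto_pub -h homebridge.local -t homebridge/motion -m "Garden"
    mosquitto_pub -h homebridge.local -t homebridge/motion/reset -m "Garden"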

AssetBurned commented 3 years ago

> Personally I find that MotionEye requires a dedicated RPi due to the CPU overhead when using it with multiple cameras.

Yeah, looking at the video stream (not to mention the CPU load) is a bit scary. I have a test setup with an RPi 3B+ and a cheap Chinese knock-off camera. Still, the CPU jumps up to 70% when I wave my hand in front of the camera, and that's nothing more than a basic Homebridge setup with just this plugin and MotionEye. The recorded video files are fine, but the stream forwarded to the iPhone is not.

If I were to deploy this setup, I would just dedicate one Pi per camera, and one of them to the Homebridge instance.

That is why I mentioned FFmpeg in my original post; I think it also has some basic functionality for detecting whether something major has changed in the frame.

Edit: When I disable recording on that Pi, the CPU load drops to ~50% and the video is smooth on the iPhone. Changing the setting so the stream has only 75% instead of 100% of the original quality doesn't make a difference.

Sunoo commented 3 years ago

I can look into the FFmpeg method when I have a chance (may be a bit, life is crazy right now), but I’d be surprised if it was meaningfully better for CPU usage than the other methods. It’d still need to be an extra instance of FFmpeg running at all times, and it would need to be doing some amount of analysis of each frame as they come in.
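For anyone who wants to experiment in the meantime, FFmpeg's select filter exposes a scene-change score, so a detector would look roughly like this (untested on my end; the threshold and stream URL are illustrative):

    # Rough sketch of scene-change detection with FFmpeg (untested; the
    # 0.003 threshold and stream URL are illustrative). Prints frame
    # metadata whenever the scene score exceeds the threshold.
    ffmpeg -i rtsp://camera-address/stream \
      -vf "select='gt(scene,0.003)',metadata=print" \
      -f null - 2>&1 | grep lavfi.scene_score

An external script would still have to watch that output and call the motion endpoints, so it wouldn't remove the always-on FFmpeg instance I mentioned.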

NorthernMan54 commented 3 years ago

@AssetBurned Personally, after trying out MotionEye for a few weeks and seeing how it improved the usability of my Wyze and Eufy cams, I bit the bullet and dedicated an RPi 3B+ to 2 cameras. I went from flakiness with the Wyze cams running the RTSP firmware to rock-solid reliability, with the added bonus of local storage of motion events. It also simplified management of them, as I now just use the MotionEye console rather than multiple vendor apps.

To keep the CPU manageable, I tuned the config: I reduced the frame rate of the web console, made other tweaks, and cut out any transcoding on the RPi.
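Roughly the sort of settings involved, with illustrative values rather than my exact config:

    # Illustrative motion.conf tweaks for lower CPU load (example values,
    # not my exact config):
    framerate 5            # analyse fewer frames per second
    stream_maxrate 2       # throttle the web-console preview stream
    movie_passthrough on   # record the camera's H.264 without re-encoding
                           # (needs a recent Motion and an H.264 netcam)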

PS: Here is my CPU load (on the four-core RPi, a load average of 2 is 50% CPU and 3 is 75%):

[attached: CPU load graph]

AssetBurned commented 3 years ago

@NorthernMan54 I assume you use external cameras? Can you share your configs and setup?

github-actions[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.