gerswin closed this issue 5 years ago
Currently not. I will implement that as soon as I find the time. Keeping this issue for tracking.
Currently working on this, I hope to finish it by end of the week.
Would it be possible to get an example of a camera accessory? And, as an added bonus a quick tutorial on how you did it would be amazing. Thank you in advance.
Hi! Unfortunately the camera support is not ready yet, sorry. I can do a PR with whatever work I have done so far if you wish.
Basically, the logic is there, but there are a few details to fill in - such as parsing exchanged values to the right format, etc. I do plan to finish this, but it's hard to give a date right now. Still, if you need the PR to try to finish it yourself I wouldn't mind.
I don't know what PR means but I'd love to help if I can
I vote for this feature too, camera is in need.
Any chance you can push what you have so far @ikalchev? I can't seem to see anything in the `camera` branch.
Yes, sure. Let me put some comments here and there so that someone can pick it up. I suppose I will be able to do it in a few hours from now.
Posted PR if you want to have a look
I think PR means pull request.
Any updates?
I am working on this whenever I can, but it's been hard to find the time lately. Will really try to finish this soon. Very sorry for this.
@ikalchev Don't apologise! You've done an amazing job making the parts of this that you have! Whenever you get around it is fantastic.
@schinckel Hey thanks man, really appreciate the nice words!
Still, I see people have interest in this feature, which I said will be finished months ago.
@ikalchev Great to hear this still has your attention! I think camera is the killer feature of HomeKit. Keep up the good work, Thanks!!
I haven’t forgotten you all, I think we are almost there
I'm very much looking forward to this. Any updates?
Me too... can't wait to integrate my ip camera in my homekit setup.
Me too! Keep up the good work please!
Thanks all for the support, and I am really very sorry this isn't finished yet. I hope to find the time in the next 2-3 weeks to finish this.
Mate, I have changed from the now deprecated homebridge component for home-assistant, which did allow me to integrate my cameras via ffmpeg. I really wish u success on this coz this is the final piece in truly having all home devices “talking” to HomeKit. Best of luck bro.
And allow us to buy u a beer once this is done. I’d be happy to....
Thanks. I actually managed to find the time to work on that yesterday. I have an issue where I don't encode some of the properties properly, but once this is sorted it should work (most of the logic is already done). I will also be using ffmpeg.
I will continue this week as well.
That’s brilliant mate. I’m sure u know that a huge crowd of us enthusiasts would be elated at this implementation
Hi all,
Pleased to announce that currently iOS and HAP-python can negotiate a stream session. HAP-python also starts ffmpeg, which does begin to stream. The issue is that at that point the Home app stays at "Loading..." and at some point gives up. I guess that either
Working on it, we are getting there!
Wow, just finished installing IP cameras. Such a bonus to see this still in the works thanks so much for your dedication!
I found a similar issue for homebridge here https://github.com/KhaosT/homebridge-camera-ffmpeg/issues/206
Is anyone willing to sync the latest from the camera_base branch and do `python3 cam.py` to see if it reproduces? You will need to install ffmpeg in order to do this (a little tricky on RPi; I am testing on a Mac. Which reminds me - I see ffmpeg is kind of deprecated in favour of avconv. The args should be the same though.) Also, if you get an error that it cannot find the input device, try this to find yours:

```
ffmpeg -f avfoundation -list_devices true -i ""
```

and modify the FFMPEG_CMD in Camera.py.
As I see the same symptoms as in the link above, I think that maybe I haven't set up ffmpeg correctly. When I use the same command but with my local address instead of the iOS one, I am able to play the stream locally (using ffplay). However, ffplay takes some time until it starts the video - I guess because it is waiting for PPS/SPS frames. One explanation is that iOS is also waiting for these frames and then times out.
hey Ivan, tested from the camera_base branch. Negotiation was successful on the 2nd attempt and the camera was added to the accessories. When started from iOS (12) it hangs in the waiting state - probably the same as in your case. I got the log DEBUG - attaching, maybe it will help you get this puzzle solved faster :) also there was a message for the unknown format (didn't pass to the log):
let me know if there's anything else to test.
cheers Alex
actually what I found: ffmpeg on my Pi doesn't have avfoundation and I can't find how to install it, so basically I can't make ffmpeg work properly. Although the good news is it is very responsive to the iPhone interface - the moment you press the camera in iOS's HomeKit, it tries to open / redirect the stream, and there are errors in the ffmpeg config stopping the stream. Looks like it is just one step away from success :)
Got a bit further - I can make ffmpeg work (modified a few of your params ** I never used ffmpeg before and maybe I did something awful, sorry for it - tried to find suitable codecs / devices to at least make it work under pyhap's hood :) ):
```python
FFMPEG_CMD = ('ffmpeg -f video4linux2 -r 5 -i /dev/video0 -threads 0 '
              '-vcodec libx264 -an -pix_fmt yuv422p -tune zerolatency '
              '-vf scale={width}:{height} -b:v {bitrate}k -bufsize {bitrate}k '
              '-payload_type 99 -f rtp '
              '-srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params {video_srtp_key} '
              'srtp://{address}:{video_port}?rtcpport={video_port}&'
              'localrtcpport={local_video_port}&pkt_size=1378')
'''Template for the ffmpeg command.'''
```
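For anyone unsure what the placeholders are: the adjacent string literals in parentheses concatenate into one Python format string, and HAP-python fills in the real values during stream negotiation. As a sketch with made-up example values (not ones HAP-python actually sends), you can preview the final command like this:

```python
# Sketch: preview how an FFMPEG_CMD-style template expands. The values
# below are illustrative only - the real width/height, SRTP key, address
# and ports come from the negotiation with the iOS client.
FFMPEG_CMD = ('ffmpeg -f video4linux2 -r 5 -i /dev/video0 -threads 0 '
              '-vcodec libx264 -an -pix_fmt yuv422p -tune zerolatency '
              '-vf scale={width}:{height} -b:v {bitrate}k -bufsize {bitrate}k '
              '-payload_type 99 -f rtp '
              '-srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params {video_srtp_key} '
              'srtp://{address}:{video_port}?rtcpport={video_port}&'
              'localrtcpport={local_video_port}&pkt_size=1378')

cmd = FFMPEG_CMD.format(
    width=1280, height=720, bitrate=300,
    video_srtp_key='BASE64KEY', address='192.168.1.18',
    video_port=50000, local_video_port=50001,
)
print(cmd)
```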
ffmpeg can now stream (although I changed codecs and it could affect the result), but no picture is delivered to HomeKit and after a timeout it stops. Communication between the HomeKit iOS camera and pyhap is excellent - every click is visible in the logs (start streaming, stop streaming). Attaching the log from the 'successful' ffmpeg start: camera2.txt
hey bro,
got a few settings changed, some of them just to make ffmpeg work at all (I still have no avfoundation), but it looks like the thing was you using the rtp protocol - changing it to rtsp made it work. (found some docs here: http://manpages.ubuntu.com/manpages/bionic/man1/ffmpeg-protocols.1.html)
Here are my config changes to yours (a bit clumsy, sorry for that):
```python
FFMPEG_CMD = ('ffmpeg -f video4linux2 -input_format h264 -video_size {width}x{height} -framerate 20 -i /dev/video0 '
              '-vcodec copy -an -payload_type 99 -ssrc 1 -f rtsp '
              '-b:v {bitrate}k -bufsize {bitrate}k '
              '-payload_type 99 -f rtp '
              '-srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params {video_srtp_key} '
              'srtp://{address}:{video_port}?rtcpport={video_port}&'
              'localrtcpport={local_video_port}&pkt_size=1378')
'''Template for the ffmpeg command.'''
```
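On the mechanics of launching such a command: a formatted command string like the one above can be turned into an argv list for `subprocess.Popen` without going through a shell, via the standard library's `shlex`. A small sketch (the command here is a shortened stand-in, not the exact one HAP-python runs):

```python
import shlex

# A shortened, illustrative ffmpeg command string (stand-in values).
cmd = ('ffmpeg -f video4linux2 -input_format h264 -i /dev/video0 '
       '-vcodec copy -an -f rtp srtp://192.168.1.18:50000?pkt_size=1378')

# shlex.split honors quoting rules, so each option/value becomes one argv
# element; subprocess.Popen(args) would then start ffmpeg (not run here).
args = shlex.split(cmd)
print(args)
```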
and the log of a full working cycle is here: camera3.txt
There are still a few things to fix (a second attempt to access video won't work until restarting the hap-server, etc.) but the whole thing is up.
thanks for your time putting all this together :)
cheers Alex
It also works from outside the local wifi - on 4G, but I had to lower the frame rate to 20.
Huge huge thanks @ignalex for trying this out!
As you said, there are a few things to take care of, but it will be much quicker now that I know it works. I will try to push a beta to the dev branch this weekend. Will keep you posted.
Best, Ivan
Wow great work guys, really looking forward to this!
I'm really looking forward to this.... Thanks for all the hard work.
Just turned off Homebridge for good, forgetting that I had 2 cameras via it, so I was happy when I found out camera support was coming via the homekit component. Looking forward to giving this a go.
For everyone eager to test this out - you can pull the camera_base branch and do `python3 cam.py`. I am merging this to dev later today and probably to master after that.
Note that, depending on your device, you may need to plug in your own command for starting the stream (e.g. ffmpeg, avconv, etc.)
Ok, the camera is now on the `dev` branch. Take a look at `camera_main.py` for how to start the thing. Take a look at `pyhap.camera.FFMPEG_CMD` and the `pyhap.camera.Camera` init args - as ignalex said earlier, you may need to tune the command for your hardware/software (use FFMPEG_CMD to see what arguments are available to you - there are even more if you want to dig into `Camera._start_stream`).
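To make the "tune for your hardware" point concrete, here is a small, entirely hypothetical helper (not part of HAP-python - the function name and the device strings `0:0` and `/dev/video0` are just common defaults) that picks the ffmpeg input arguments per platform:

```python
import platform

def ffmpeg_input_args():
    """Pick ffmpeg input arguments for the local camera device.

    Hypothetical helper for illustration - the device names are common
    defaults, not guaranteed to match your setup.
    """
    system = platform.system()
    if system == 'Darwin':
        # macOS captures via AVFoundation; '0:0' is often the default
        # camera:microphone pair (check with -list_devices true).
        return '-f avfoundation -i 0:0'
    if system == 'Linux':
        # Linux (e.g. Raspberry Pi) captures via video4linux2.
        return '-f video4linux2 -i /dev/video0'
    raise RuntimeError('no known capture input for %s' % system)

print(ffmpeg_input_args())
```

One could then splice the result into the front of the FFMPEG_CMD template instead of hard-coding a single input device.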
Note that this is a beta and only 2 people have tried it out, so any feedback on improving this is welcome.
I'm a little confused how to test this. I'm happy to download the `dev` branch, and I thought it would just be a case of adding the relevant component and settings into the homekit entries within `configuration.yaml`, but I'm guessing that's not the case. Any chance of a step-by-step setup? I've looked at `camera_main.py` and I'm lost.
@aptonline look at camera.py - there is the ffmpeg starting command line. Make sure it works by itself first.
I will test the updates later this week when I return from overseas.
@aptonline I haven't added the camera to Home Assistant yet. Will try to do so next week.
Try `python3 camera_main.py`. If it doesn't work, you should see in the logs something like “starting stream command: the command”. Try playing with the arguments to make them fit your software/hardware - you can change them directly in pyhap/camera.py#FFMPEG_CMD
Sorry @cdce8p, I assumed this thread was for the HomeKit integration into HA; I followed the link from a HA community post.
Newbie here, getting the following error when trying to run `python3 camera_main.py`:

```
Traceback (most recent call last):
  File "camera_main.py", line 5, in <module>
    from pyhap.accessory_driver import AccessoryDriver
  File "/HAP-python/pyhap/accessory_driver.py", line 37, in <module>
    from zeroconf import ServiceInfo, Zeroconf
ImportError: No module named 'zeroconf'
```
@aptonline Can you try `pip install -e .` in the /HAP-python directory? That should install all missing dependencies - assuming you're using the git repo and haven't installed HAP-python through pip in the first place.
> Newbie here, getting the following error when trying to run `python3 camera_main.py`:
> `ImportError: No module named 'zeroconf'`
You need to go through the full installation process (https://github.com/ikalchev/HAP-python#Installation).
This could also help: http://www.kalitut.com/2017/11/raspberry-pi-set-up-zeroconf-bonjour.html
I was able to get the camera to show up in the Home app; however, the video does not load. I guess I need to adjust the ffmpeg parameters but have no idea what to use. This is on macOS.
@ignalex I thought I had, but it's working now :)
The camera added in the Home app OK, but there's no video stream. I'm guessing it's because I couldn't find a way to authenticate to the camera with a username/password - I only saw the IP in `camera_main.py`. Have I missed something?
@aptonline The authentication is taken care of for you. Does the ffmpeg process start?
@hjtech I am having the same problem in that I cannot start it on macOS. The issue for me is that the resulting fps is around 5.
The command is very similar to the implementation in HAP-nodejs/homebridge - did any of you use it?
@ikalchev I can't make it work in the current dev branch this time. I tried my ffmpeg config cmd that was working before. The ffmpeg process starts, but no stream reaches iOS and after 30 secs it disconnects. log1.txt
Did you re-apply the fixes for your command? I see that the stream hasn't started
> @aptonline The authentication is taken care of for you. Does the ffmpeg process start?
So it looks like it's starting, but there's no output:

```
ffmpeg version 3.2.12-1~deb9u1 Copyright (c) 2000-2018 the FFmpeg developers
  built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
  configuration: --prefix=/usr --extra-version='1~deb9u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 34.101 / 55. 34.101
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.101 / 57. 56.101
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100
Unknown input format: 'avfoundation'
[hap_server] Request PUT from address '('192.168.1.18', 49284)' for path '/characteristics'.
[hap_server] Set characteristics content: {'characteristics': [{'aid': 1, 'iid': 15, 'value': 'ARUCAQABEOZy2nWOE0YXnNH6hL2pYmQ='}]}
[characteristic] client_update_value: SelectedRTPStreamConfiguration to ARUCAQABEOZy2nWOE0YXnNH6hL2pYmQ=
[camera] Set stream config request: 0
[camera] [e672da75-8e13-4617-9cd1-fa84bda96264] Stopping stream.
[hap_server] 192.168.1.18 - "PUT /characteristics HTTP/1.1" 204 -
```
Also seeing `Unknown input format: 'avfoundation'` in red, if that's significant?
Is it possible?