CarterBrehm opened this issue 4 years ago
What kind of cameras are you using?
I'm using a Video Doorbell, two Q's, and I think I just have regular Arlo Pro cameras for the outside. 6 cameras in total. I've done some more experimenting and it seems like sometimes it will show the last recorded motion event, but I haven't gotten it to stream yet.
I think this needs fixing with the stream component. I'll see if somebody is willing to lend me an Arlo Q camera to test.
Hi!
Just adding that I have similar issues with Arlo Q and livestream:
2020-02-17 20:42:51 INFO (MainThread) [homeassistant.components.stream] Started stream: rtsps://vzwow165-z1-prod.ar.arlo.com:443/vzmodulelive/...?egressToken=token&userAgent=iOS&cameraId=cameraid
2020-02-17 20:42:54 INFO (SyncWorker_12) [homeassistant.components.netgear.device_tracker] Scanning
2020-02-17 20:42:54 INFO (SyncWorker_12) [pynetgear] Get attached devices 2
2020-02-17 20:42:57 ERROR (stream_worker) [homeassistant.components.stream.worker] Error demuxing stream: No dts in packet
2020-02-17 20:42:57 INFO (MainThread) [homeassistant.components.stream] Stopped stream: rtsps://vzwow165-z1-prod.ar.arlo.com:443/vzmodulelive/...?egressToken=token&userAgent=iOS&cameraId=cameraid
2020-02-17 20:42:57 ERROR (MainThread) [aiohttp.server] Error handling request
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/aiohttp/web_protocol.py", line 418, in start
    resp = await task
  File "/usr/local/lib/python3.7/site-packages/aiohttp/web_app.py", line 458, in _handle
    resp = await handler(request)
  File "/usr/local/lib/python3.7/site-packages/aiohttp/web_middlewares.py", line 119, in impl
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/real_ip.py", line 39, in real_ip_middleware
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/ban.py", line 72, in ban_middleware
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/auth.py", line 135, in auth_middleware
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/view.py", line 123, in handle
    result = await result
  File "/usr/src/homeassistant/homeassistant/components/stream/core.py", line 181, in get
    return await self.handle(request, stream, sequence)
  File "/usr/src/homeassistant/homeassistant/components/stream/hls.py", line 36, in handle
    body=renderer.render(track, utcnow()).encode("utf-8"), headers=headers
  File "/usr/src/homeassistant/homeassistant/components/stream/hls.py", line 95, in render
    + self.render_playlist(track, start_time)
  File "/usr/src/homeassistant/homeassistant/components/stream/hls.py", line 67, in render_preamble
    return ["#EXT-X-VERSION:3", f"#EXT-X-TARGETDURATION:{track.target_duration}"]
  File "/usr/src/homeassistant/homeassistant/components/stream/core.py", line 83, in target_duration
    return round(sum(durations) // len(self._segments)) or 1
ZeroDivisionError: integer division or modulo by zero
Is anybody comfortable with debugging from Chrome? I want to see if the command to start streaming is different for Q cameras. Basically:
1. Open the debug window (it will be filling up) and click on the Network tab.
2. Look for a packet in the Name column called startStream and select it. I'm interested in the Headers and Preview tabs of that packet.
3. In the Headers tab scroll down to the Request Payload, expand it as much as possible and paste it here, anonymized.
4. In the Preview tab expand the response as much as possible and paste it here, anonymized.
For my cameras they look something like this:
to: "4XXXXXXXXXX8"
from: "XXXXXXXXXX_web"
resource: "cameras/5XXXXXXXXXX"
action: "set"
responseUrl: ""
publishResponse: true
transId: "web!7ff03760.a1db6!1582143108843"
properties: {activityState: "startUserStream", cameraId: "5XXXXXXX"}
activityState: "startUserStream"
cameraId: "5XXXXXXXXXX"
and
data: {,…}
url: "rtmps://vzwow92-z1-prod.ar.arlo.com:80/vzmodulelive?egressToken=306d4a2XXXXXXXXXXXXXXXX&userAgent=web&cameraId=XXXXXXXXX_1582143108927"
success: true
Here's one for an Arlo Q camera:
to: "4XXXXXXXXXXX1"
from: "XXXX-XXX-XXXXXXXX_web"
resource: "cameras/4XXXXXXXXXXX1"
action: "set"
responseUrl: ""
publishResponse: true
transId: "web!ecb1f6aa.7fea5!1582144947531"
properties: {activityState: "startUserStream", cameraId: "4XXXXXXXXXXX1"}
activityState: "startUserStream"
cameraId: "4XXXXXXXXXXX1"
data: {,…}
url: "rtmps://vzwow52-z1-prod.ar.arlo.com:80/vzmodulelive?egressToken=mytoken&userAgent=web&cameraId=mycameraid__1582144961554"
success: true
Seems like there's a little difference in the url, vzwow92 for you and vzwow52 for me.
Thanks for the quick reply. Which OS was that from?
And yes, there was no significant difference at all. I was hoping there was something - then I could have added it to the code for Q cameras and seen if it fixed the problem.
Another option is a different user agent. This is a long shot but might work with the Arlo Q cameras. Add this to your config and restart HA.
aarlo:
  user_agent: linux
The previous info was from macOS 10.15.3 (Chrome 80.0.3987.87).
With your line, I no longer get the error, but the stream doesn't work either.
2020-02-20 08:43:58 INFO (MainThread) [homeassistant.components.stream] Started stream: rtmps://vzwow78-z1-prod.ar.arlo.com:80/vzmodulelive?egressToken=mytoken&userAgent=web&cameraId=mycameraid_1582184633984
Now it gets fun - we need to check if the stream is actually readable by the ffmpeg executable inside home-assistant. This gets complicated and I'm hoping somebody out there has some coding/console/home-assistant-messing-about experience and can try this with an Arlo Q camera. What we need to do is:
1. Ask Arlo to start a stream, but stop the URL from being handed to the stream component.
2. Manually open an ffmpeg connection to that URL and capture the output.
Luckily we can jerry-rig (jury-rig?) most of this.
This patch will request Arlo start the stream but stop the URL being passed back to the stream component. Apply it directly to the Aarlo custom component under your home assistant configuration directory - it's one line so it's probably easiest to apply manually.
diff --git a/custom_components/aarlo/pyaarlo/camera.py b/custom_components/aarlo/pyaarlo/camera.py
index 591f331..d8c123e 100644
--- a/custom_components/aarlo/pyaarlo/camera.py
+++ b/custom_components/aarlo/pyaarlo/camera.py
@@ -475,7 +475,7 @@ class ArloCamera(ArloChildDevice):
             return None
         url = reply['url'].replace("rtsp://", "rtsps://")
         self._arlo.debug('url={}'.format(url))
-        return url
+        return None
 
     def get_video(self):
         video = self.last_video
Make sure debug is turned on. Set logger: in configuration.yaml to this:
logger:
  default: info
  logs:
    pyaarlo: debug
Restart your home-assistant.
Cut and paste the following shell script into your home-assistant configuration directory, call it stream.
#!/bin/bash
#
URL=$(grep url= home-assistant.log | tail -1 | cut -f 2- -d =)
if [[ -z "${URL}" ]]; then
    echo "no url"
    exit 1
fi
echo $URL
mkdir s 2>/dev/null
ffmpeg -i ${URL} \
    -fflags flush_packets -max_delay 2 -flags -global_header \
    -hls_time 2 -hls_list_size 3 -vcodec copy -y s/video.m3u8
Enter your home-assistant environment and make stream executable. The first time you run it there won't be a stream so it will do nothing. I'm using a docker so my commands go like this:
$ docker exec -it home-assistant bash
bash-5.0# chmod +x stream
bash-5.0# ./stream
no url
For a virtualenv install it's probably sufficient to source the appropriate activate file and change to your config directory. Now start a live stream, wait a few seconds and run the stream command again. This time it should try and connect to the Arlo servers and start generating hls output. For my non-ArloQ cameras it looks like this:
bash-5.0# ./stream
rtsps://vzwow117-z1-prod.ar.arlo.com:443/vzmodulelive/XXXXXXXXXX_1582255723060?egressToken=fcf7187b_54b0_4f3a_9ea1_0a80943e00f0&userAgent=iOS&cameraId=XXXXXXXXX_F23_1582255723060
ffmpeg -fflags nobuffer -rtsp_transport tcp -i rtsps://vzwow117-z1-prod.ar.arlo.com:443/vzmodulelive/XXXXXXXXXX_1582255723060?egressToken=fcf7187b_54b0_4f3a_9ea1_0a80943e00f0&userAgent=iOS&cameraId=XXXXXXXXXX_1582255723060 -vsync 0 -copyts -vcodec copy -movflags frag_keyframe+empty_moov -an -hls_flags delete_segments+append_list -f segment -segment_list_flags live -segment_time 1 -segment_list_size 3 -segment_format mpegts -segment_list s/index.m3u8 -segment_list_type m3u8 -segment_list_entry_prefix /stream/ s/%d.ts
ffmpeg version 4.1.4 Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 8.3.0 (Alpine 8.3.0)
configuration: --prefix=/usr --enable-avresample --enable-avfilter --enable-gnutls --enable-gpl --enable-libass --enable-libmp3lame --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libx264 --enable-libx265 --enable-libtheora --enable-libv4l2 --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-libxcb --disable-stripping --disable-static --disable-librtmp --enable-vaapi --enable-vdpau --enable-libopus --disable-debug
libavutil 56. 22.100 / 56. 22.100
libavcodec 58. 35.100 / 58. 35.100
libavformat 58. 20.100 / 58. 20.100
libavdevice 58. 5.100 / 58. 5.100
libavfilter 7. 40.101 / 7. 40.101
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 3.100 / 5. 3.100
libswresample 3. 3.100 / 3. 3.100
libpostproc 55. 3.100 / 55. 3.100
Input #0, rtsp, from 'rtsps://vzwow117-z1-prod.ar.arlo.com:443/vzmodulelive/XXXXXXX3_1582255723060?egressToken=fcf7187b_54b0_4f3a_9ea1_0a80943e00f0&userAgent=iOS&cameraId=XXXXXXXXXX_1582255723060':
Metadata:
title : XXXXXXXXXX_1582255723060
Duration: N/A, start: 0.320000, bitrate: N/A
Stream #0:0: Audio: aac (LC), 16000 Hz, mono, fltp
Stream #0:1: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1072, 15 fps, 15 tbr, 90k tbn, 30 tbc
Stream mapping:
Stream #0:1 -> #0:0 (copy)
Stream #0:0 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
[hls @ 0x55de5a79ed00] Opening 's/video0.ts' for writing
Output #0, hls, to 's/video.m3u8':
Metadata:
title : XXXXXXXXXX_1582255723060
encoder : Lavf58.20.100
Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1072, q=2-31, 15 fps, 15 tbr, 90k tbn, 15 tbc
Stream #0:1: Audio: aac (LC), 16000 Hz, mono, fltp, 69 kb/s
Metadata:
encoder : Lavc58.35.100 aac
[hls @ 0x55de5a79ed00] Non-monotonous DTS in output stream 0:0; previous: 50970, current: 21060; changing to 50971. This may result in incorrect timestamps in the output file.
[hls @ 0x55de5a79ed00] Non-monotonous DTS in output stream 0:0; previous: 50971, current: 27000; changing to 50972. This may result in incorrect timestamps in the output file.
[hls @ 0x55de5a79ed00] Non-monotonous DTS in output stream 0:0; previous: 50972, current: 33030; changing to 50973. This may result in incorrect timestamps in the output file.
[hls @ 0x55de5a79ed00] Non-monotonous DTS in output stream 0:0; previous: 50973, current: 38970; changing to 50974. This may result in incorrect timestamps in the output file.
[hls @ 0x55de5a79ed00] Non-monotonous DTS in output stream 0:0; previous: 50974, current: 45000; changing to 50975. This may result in incorrect timestamps in the output file.
[hls @ 0x55de5a79ed00] Non-monotonous DTS in output stream 0:0; previous: 50975, current: 50940; changing to 50976. This may result in incorrect timestamps in the output file.
[hls @ 0x55de5a79ed00] Opening 's/video1.ts' for writing
[hls @ 0x55de5a79ed00] Cannot use rename on non file protocol, this may lead to races and temporary partial files
[hls @ 0x55de5a79ed00] Opening 's/video2.ts' for writingrate=N/A speed=1.97x
[hls @ 0x55de5a79ed00] Opening 's/video3.ts' for writingrate=N/A speed=1.48x
[hls @ 0x55de5a79ed00] Opening 's/video4.ts' for writingrate=N/A speed=1.43x
frame= 145 fps= 19 q=-1.0 Lsize=N/A time=00:00:10.28 bitrate=N/A speed=1.35x
video:1035kB audio:67kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[aac @ 0x55de5a77fc00] Qavg: 65523.383
Exiting normally, received signal 2.
This diff might be easier to try. It turns on debugging for the Python av component. You have to make this change inside the homeassistant code.
diff --git a/homeassistant/components/stream/__init__.py b/homeassistant/components/stream/__init__.py
index d88f90a83..308dc5b77 100644
--- a/homeassistant/components/stream/__init__.py
+++ b/homeassistant/components/stream/__init__.py
@@ -37,7 +37,9 @@ SERVICE_RECORD_SCHEMA = STREAM_SERVICE_SCHEMA.extend(
     }
 )
 
 # Set log level to error for libav
-logging.getLogger("libav").setLevel(logging.ERROR)
+logging.getLogger("libav").setLevel(logging.DEBUG)
+logging.getLogger().setLevel(5)
+
 
 @bind_hass
@bind_hass
Restart homeassistant, start a stream and you'll see something like this in the logging which might help us work out what is going wrong:
2020-02-21 09:55:38 WARNING (stream_worker) [libav.mpegts] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
2020-02-21 09:55:38 WARNING (stream_worker) [libav.mpegts] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
2020-02-21 09:55:40 INFO (stream_worker) [libav.aac] Qavg: 65536.000
2020-02-21 09:55:40 WARNING (stream_worker) [libav.aac] 2 frames left in the queue on closing
2020-02-21 09:55:40 WARNING (stream_worker) [libav.mpegts] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
2020-02-21 09:55:40 WARNING (stream_worker) [libav.mpegts] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
2020-02-21 09:55:42 INFO (stream_worker) [libav.aac] Qavg: 65536.000
I only have a hassio setup, but I took the stream URL and tried it on my host with:
ffmpeg -version
ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers
built with Apple clang version 11.0.0 (clang-1100.0.33.16)
configuration: --prefix=/usr/local/Cellar/ffmpeg/4.2.2 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags='-I/Library/Java/JavaVirtualMachines/adoptopenjdk-13.0.1.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/adoptopenjdk-13.0.1.jdk/Contents/Home/include/darwin -fno-stack-check' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
ffmpeg -v verbose -fflags flush_packets -max_delay 2 -flags -global_header -hls_time 2 -hls_list_size 3 -vcodec copy -y s/video.m3u8 -i "rtmps://vzwow388-z2-prod.ar.arlo.com:80/vzmodulelive?egressToken=XXXXXX&userAgent=web&cameraId=XXXX"
Parsing...
Parsed protocol: 4
Parsed host : vzwow388-z2-prod.ar.arlo.com
Parsed app : vzmodulelive?egressToken=XXXXX&userAgent=web&cameraId=XXXXXX
HandShake: Type Answer : 03
HandShake: Server Uptime : 96850052
HandShake: FMS Version : 3.0.1.1
HandShake: Handshaking finished....
RTMP_Connect1, handshaked
Invoking connect
HandleServerBW: server BW = 2500000
HandleClientBW: client BW = 2500000 2
HandleCtrl, received ctrl. type: 0, len: 6
HandleCtrl, Stream Begin 0
HandleChangeChunkSize, received: chunk size change to 512
RTMP_ClientPacket, received: invoke 336 bytes
(object begin)
Property: <Name: no-name., STRING: _error>
Property: <Name: no-name., NUMBER: 1.00>
Property: NULL
Property: <Name: no-name., OBJECT>
(object begin)
Property: <Name: level, STRING: error>
Property: <Name: code, STRING: NetConnection.Connect.Rejected>
Property: <Name: description, STRING: Connection failed: Application rejected connection.>
Property: <Name: application, STRING: invalid request>
Property: <Name: ex, OBJECT>
(object begin)
Property: <Name: redirect, STRING: Not valid request>
Property: <Name: code, NUMBER: 302.00>
(object end)
Property: <Name: clientid, NUMBER: 2037XXXXXX.00>
Property: <Name: secureToken, STRING: XXXXXXXXXXXXXX>
(object end)
(object end)
HandleInvoke, server invoking <_error>
rtmp server sent error
RTMP_ClientPacket, received: invoke 18 bytes
(object begin)
Property: <Name: no-name., STRING: close>
Property: <Name: no-name., NUMBER: 0.00>
Property: NULL
(object end)
HandleInvoke, server invoking <close>
rtmp server requested close
This is interesting: https://github.com/tchellomello/python-arlo/issues/8
So when I make a request to https://my.arlo.com/hmsweb/users/devices/startStream with that Mozilla/5.0 (iPhone; CPU iPhone OS 11_1_2 like Mac OS X) AppleWebKit/604.3.5 (KHTML, like Gecko) Mobile/15B202 NETGEAR/v1 (iOS Vuezone) user agent, I do get a rtsp:// url back.
Changing the protocol then to rtsps:// allows me to capture the stream via ffmpeg -v verbose -re -i "rtsps://vzwow233-z2-prod.ar.arlo.com:443/vzmodulelive/xxx?egressToken=xxx&userAgent=iOS&cameraId=xxxx" -acodec copy -vcodec copy ./test.mp4, which I can then successfully play.
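For anyone who wants to reproduce that call outside a browser, here's a rough Python sketch of the same request. It's only a sketch: the token, IDs, "from" value and transId format are placeholders you would have to take from an authenticated my.arlo.com session, and the payload shape is simply copied from the startStream captures earlier in this thread.

import time

import requests

# Placeholders - grab these from an authenticated my.arlo.com session.
ARLO_TOKEN = "xxxx"
BASE_STATION_ID = "4XXXXXXXXXXX1"
CAMERA_ID = "4XXXXXXXXXXX1"

# The iOS user agent mentioned above; it makes Arlo hand back an rtsp:// URL.
IOS_USER_AGENT = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 11_1_2 like Mac OS X) "
    "AppleWebKit/604.3.5 (KHTML, like Gecko) Mobile/15B202 NETGEAR/v1 (iOS Vuezone)"
)

# Payload shape copied from the Chrome startStream capture shown earlier in the thread.
payload = {
    "to": BASE_STATION_ID,
    "from": "myuserid_web",
    "resource": f"cameras/{CAMERA_ID}",
    "action": "set",
    "responseUrl": "",
    "publishResponse": True,
    "transId": f"web!{int(time.time() * 1000)}",
    "properties": {"activityState": "startUserStream", "cameraId": CAMERA_ID},
}

resp = requests.post(
    "https://my.arlo.com/hmsweb/users/devices/startStream",
    json=payload,
    headers={"Authorization": ARLO_TOKEN, "User-Agent": IOS_USER_AGENT},
    timeout=10,
)
resp.raise_for_status()

# The URL may sit at the top level or under "data" depending on the response;
# either way, rewrite rtsp:// to rtsps:// before feeding it to ffmpeg.
body = resp.json()
url = (body.get("data") or body).get("url", "").replace("rtsp://", "rtsps://")
print(url)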
I already fix up the rtsp:// to rtsps://, it's one of the many Arlo quirks!
And in the previous post you're getting an rtmps: stream, do you have a user_agent configured?
When I used the linux user agent, I got an rtmps: URL that I wasn't able to stream. Turns out that with your default settings I already got that rtsp:// stream, so I removed the user_agent config again.
In the end, the problem with my Q cameras came down to this: https://github.com/home-assistant/home-assistant/blob/dev/homeassistant/components/stream/worker.py#L77
Changing that to not throw an error and to just ignore the packet gets me a stream in the UI. Also found some earlier discussion about this here: https://github.com/home-assistant/home-assistant/issues/22840#issuecomment-512593730
Haven't contributed to HA so far, so I don't have a local setup yet, but I'll see if we can get that change in there.
That's great you got it going. I was leaning towards the problem being in the stream component but I like to get as much information as possible before raising the issue.
And many thanks for your work here.
Does this patch look similar to how you got it working? I just ignore the first 10 dts errors.
--- /usr/src/homeassistant.orig/homeassistant/components/stream/worker.py
+++ /usr/src/homeassistant/homeassistant/components/stream/worker.py
@@ -66,6 +66,8 @@
     first_pts = 0
     # The decoder timestamp of the latest packet we processed
     last_dts = None
+    # Allow some no dts packets in
+    no_dts = 0
 
     while not quit_event.is_set():
         try:
@@ -73,8 +75,13 @@
             if packet.dts is None:
                 if first_packet:
                     continue
+                no_dts += 1
+                if no_dts < 10:
+                    continue
                 # If we get a "flushing" packet, the stream is done
-                raise StopIteration("No dts in packet")
+                raise StopIteration("No dts in packet too often")
+            else:
+                no_dts = 0
         except (av.AVError, StopIteration) as ex:
             # End of stream, clear listeners and stop thread
             for fmt, _ in outputs.items():
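If anybody wants to sanity-check the idea outside Home Assistant first, here's a minimal standalone sketch using PyAV (the same library the stream component uses). The URL is a placeholder - grab a real rtsps:// one from the pyaarlo debug log as described above.

import av

# Placeholder - take the real URL from the pyaarlo debug log ("url=...").
URL = "rtsps://vzwowXX-z1-prod.ar.arlo.com:443/vzmodulelive/XXX?egressToken=XXX&userAgent=iOS&cameraId=XXX"

container = av.open(URL, options={"rtsp_transport": "tcp"})
video_stream = container.streams.video[0]

no_dts = 0
for packet in container.demux(video_stream):
    if packet.dts is None:
        no_dts += 1
        if no_dts < 10:
            continue  # tolerate a handful of timestamp-less packets
        print("too many packets without a dts, giving up")
        break
    no_dts = 0
    print(f"dts={packet.dts} keyframe={packet.is_keyframe}")

If this loops happily printing dts values, the same skip-a-few-packets approach should work inside the stream worker.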
For the quick test I didn't have a counter, but this seems fine!
Does the latest update fix the Arlo Q live stream? If it does, I've updated and it's doing the same thing, no live stream.
I can't fix it from inside the Aarlo component; you need to patch the stream component and try.
You can copy the patch from the post two above into a file called /config/aarlo.patch and reboot twice - the first time to patch stream, the second to use the newly patched stream - and see if that works.
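For reference, the reboot-twice mechanism is nothing magic: the first restart applies /config/aarlo.patch to the installed Home Assistant source with the stock patch tool (judging by the patch command that shows up in the logs further down this thread), and the second restart then runs the patched code. A rough Python equivalent of that step - a sketch, not the component's exact code:

import subprocess

# --forward (-N) skips hunks that already look applied, so re-running on an
# already-patched tree doesn't prompt or reverse anything.
subprocess.run(
    "/usr/bin/patch -p0 -N < '/config/aarlo.patch'",
    shell=True,
    check=False,
)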
Do I need to create that file and then just copy the patch from your comment above into it?
For now, yes. I need to know where the file is and the config directory is the easiest to locate.
I'll work on a nicer way of doing this.
If it goes wrong, is there any way to revert the stream component back to how it was? Will a snapshot work?
This is a reverse of the diff; copy this to /config/aarlo.patch and that should fix it. Are you using docker or virtualenv?
--- /usr/src/homeassistant/homeassistant/components/stream/worker.py
+++ /usr/src/homeassistant.orig/homeassistant/components/stream/worker.py
@@ -66,8 +66,6 @@
     first_pts = 0
     # The decoder timestamp of the latest packet we processed
     last_dts = None
-    # Allow some no dts packets in
-    no_dts = 0
 
     while not quit_event.is_set():
         try:
@@ -75,13 +73,8 @@
             if packet.dts is None:
                 if first_packet:
                     continue
-                no_dts += 1
-                if no_dts < 10:
-                    continue
                 # If we get a "flushing" packet, the stream is done
-                raise StopIteration("No dts in packet too often")
-            else:
-                no_dts = 0
+                raise StopIteration("No dts in packet")
         except (av.AVError, StopIteration) as ex:
             # End of stream, clear listeners and stop thread
             for fmt, _ in outputs.items():
I'm using docker on Ubuntu 18.04.
I've not tried it yet, I just wanted to know how to get back to the original if it didn't work. Does this patch fix it to allow live streaming for a Q on the custom aarlo card? I haven't misread, have I?
Do you think this will be easy to fix in the future, or will this patch be required anyway?
With a docker you can just throw the docker instance away and start again.
The patch should allow Arlo Q cameras to stream. I don't have one so I'm relying on people with Q cameras to let me know if it works.
It is a simple fix but, looking at the comments in the stream threads on home-assistant.io, it's been known about for a while.
@twrecked I can give this a shot. Are you saying to literally copy that diff below (with the pluses and minuses) into that location?
--- /usr/src/homeassistant.orig/homeassistant/components/stream/worker.py
+++ /usr/src/homeassistant/homeassistant/components/stream/worker.py
@@ -66,6 +66,8 @@
     first_pts = 0
     # The decoder timestamp of the latest packet we processed
     last_dts = None
+    # Allow some no dts packets in
+    no_dts = 0
 
     while not quit_event.is_set():
         try:
@@ -73,8 +75,13 @@
             if packet.dts is None:
                 if first_packet:
                     continue
+                no_dts += 1
+                if no_dts < 10:
+                    continue
                 # If we get a "flushing" packet, the stream is done
-                raise StopIteration("No dts in packet")
+                raise StopIteration("No dts in packet too often")
+            else:
+                no_dts = 0
         except (av.AVError, StopIteration) as ex:
             # End of stream, clear listeners and stop thread
             for fmt, _ in outputs.items():
Okay, so I gave it a go, and the results don't look good.
On first restart, I get the log to show up (It seems it logs this as an error even though it's really an info log):
Log Details (ERROR)
Sun Mar 08 2020 12:25:41 GMT-0500 (Central Daylight Time)
/usr/bin/patch -p0 -N < '/config/aarlo.patch'
After the second restart, trying to stream from the aarlo lovelace card makes this show up in the logs (same stuff as before):
Log Details (ERROR)
Sun Mar 08 2020 12:32:15 GMT-0500 (Central Daylight Time)
Error demuxing stream: No dts in packet
Log Details (ERROR)
Sun Mar 08 2020 12:32:15 GMT-0500 (Central Daylight Time)
Error handling request
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/aiohttp/web_protocol.py", line 418, in start
    resp = await task
  File "/usr/local/lib/python3.7/site-packages/aiohttp/web_app.py", line 458, in _handle
    resp = await handler(request)
  File "/usr/local/lib/python3.7/site-packages/aiohttp/web_middlewares.py", line 119, in impl
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/real_ip.py", line 39, in real_ip_middleware
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/ban.py", line 72, in ban_middleware
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/auth.py", line 135, in auth_middleware
    return await handler(request)
  File "/usr/src/homeassistant/homeassistant/components/http/view.py", line 123, in handle
    result = await result
  File "/usr/src/homeassistant/homeassistant/components/stream/core.py", line 181, in get
    return await self.handle(request, stream, sequence)
  File "/usr/src/homeassistant/homeassistant/components/stream/hls.py", line 36, in handle
    body=renderer.render(track, utcnow()).encode("utf-8"), headers=headers
  File "/usr/src/homeassistant/homeassistant/components/stream/hls.py", line 95, in render
    + self.render_playlist(track, start_time)
  File "/usr/src/homeassistant/homeassistant/components/stream/hls.py", line 67, in render_preamble
    return ["#EXT-X-VERSION:3", f"#EXT-X-TARGETDURATION:{track.target_duration}"]
  File "/usr/src/homeassistant/homeassistant/components/stream/core.py", line 83, in target_duration
    return round(sum(durations) // len(self._segments)) or 1
ZeroDivisionError: integer division or modulo by zero
I want to say this is probably because the patching code in the component is not finding the right place to apply the patch. I'm running homeassistant through docker on an Ubuntu VM, and I don't seem to have a /usr/src/homeassistant folder. In order to apply this patch, do I need to be running homeassistant from source code?
The stream component needs to be fixed to handle this error better; the patch is a huge kludge.
But it should work on a docker. You'll have problems with virtualenv, but docker has a fixed source location. Did you check from inside the docker?
$ docker exec -it your-docker-instance-name bash
# cd /usr/src
# ls
I'll work on generating a PR for the stream component. But it'll take me a couple of days to do it.
@twrecked after "bashing" into the docker instance, I can see the /usr/src location. I opened up the worker.py file and the patch is definitely not applied. Here's what the relevant lines look like on mine:
    while not quit_event.is_set():
        try:
            packet = next(container.demux(video_stream))
            if packet.dts is None:
                if first_packet:
                    continue
                # If we get a "flushing" packet, the stream is done
                raise StopIteration("No dts in packet")
        except (av.AVError, StopIteration) as ex:
            # End of stream, clear listeners and stop thread
            for fmt, _ in outputs.items():
                hass.loop.call_soon_threadsafe(stream.outputs[fmt].put, None)
            _LOGGER.error("Error demuxing stream: %s", str(ex))
            break
I'm currently running homeassistant v0.104.3. Do you think that might be why the patch isn't applying?
@twrecked were you able to get a PR in for the stream component?
No. Sorry. Real life and work got in the way.
Can somebody try https://github.com/twrecked/hass-aarlo/releases/tag/v0.6.18.3 with an Arlo Q camera?
Not sure it will fix it, but I can now set the options I pass to the ffmpeg library.
So I gave v0.6.18.3 a go, and the results are strange.
When I click on the aarlo glance card, it acts like it is working, but the image being shown never updates. Below is what the card looks like while "streaming":
There are no errors posted in the logs, and it looks like a stream is set up, so I don't know what's going on. Is there another way to view the stream other than the aarlo card?
Here are the debug logs (I sanitized some stuff, but I honestly don't know how much of this is sensitive data):
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/activityState=startUserStream
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=startUserStream
2020-03-29 09:19:01 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:startUserStream
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/activityState=startUserStream
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:19:01 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=startUserStream
2020-03-29 09:19:01 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:startUserStream
2020-03-29 09:19:03 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:19:03 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:19:03 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/activityState=userStreamActive
2020-03-29 09:19:03 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:19:03 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=userStreamActive
2020-03-29 09:19:03 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:userStreamActive
2020-03-29 09:19:05 DEBUG (MainThread) [pyaarlo] url=rtsps://vzwow954-z2-prod.ar.arlo.com:443/vzmodulelive/xxx?egressToken=xxx&userAgent=iOS&cameraId=xxx
2020-03-29 09:19:05 INFO (MainThread) [homeassistant.components.stream] Started stream: rtsps://vzwow954-z2-prod.ar.arlo.com:443/vzmodulelive/xxx?egressToken=xxx&userAgent=iOS&cameraId=xxx
2020-03-29 09:19:05 INFO (MainThread) [homeassistant.components.stream] Started stream: rtsps://vzwow954-z2-prod.ar.arlo.com:443/vzmodulelive/xxx?egressToken=xxx&userAgent=iOS&cameraId=xxx
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] sending mediaUploadNotification to 4SS176SX42771
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got mediaUploadNotification
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one mediaUploadNotification
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/mediaObjectCount=987480
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/presignedLastImageUrl=https://arlolastimage-z2.s3.a
2020-03-29 09:19:35 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:presignedLastImageUrl:https://arlolastimage-z2.s3.amazonaws.com/xxx/M
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo thumbnail changed
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] queing image update
2020-03-29 09:19:35 DEBUG (ArloEventStream) [pyaarlo] turning recent ON for Lilly's Arlo
2020-03-29 09:19:35 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:recentActivity:True
2020-03-29 09:19:35 DEBUG (ArloBackgroundWorker) [pyaarlo] getting image for Lilly's Arlo
2020-03-29 09:19:36 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:19:36 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:19:36 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/activityState=idle
2020-03-29 09:19:36 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:19:36 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=idle
2020-03-29 09:19:36 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:idle
2020-03-29 09:19:36 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/lastImageSource=capture/03-29 09:19
2020-03-29 09:19:36 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/lastCapture=03-29 09:19
2020-03-29 09:19:36 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/presignedLastImageData=b'\xff\xd8\xff\xe0\x00\x10JF
2020-03-29 09:19:36 DEBUG (ArloBackgroundWorker) [custom_components.aarlo.camera] callback:Lilly's Arlo:presignedLastImageData:b'\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01\x02\x00\x01\xdc\x01\xdb\x00\x00\xff\xfe\x
2020-03-29 09:19:36 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=idle
2020-03-29 09:19:36 DEBUG (ArloBackgroundWorker) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:idle
2020-03-29 09:19:37 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:19:37 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:19:37 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/activityState=idle
2020-03-29 09:19:37 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:19:37 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=idle
2020-03-29 09:19:37 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:idle
2020-03-29 09:19:42 DEBUG (ArloBackgroundWorker) [pyaarlo] fast refresh
2020-03-29 09:19:42 DEBUG (ArloBackgroundWorker) [pyaarlo] day testing with 2020-03-29!
2020-03-29 09:20:01 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:20:01 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:20:01 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/audioDetected=True
2020-03-29 09:20:01 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:20:01 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/audioDetected=True
2020-03-29 09:20:04 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:20:04 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:20:04 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] sending mediaUploadNotification to 4SS176SX42771
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got mediaUploadNotification
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one mediaUploadNotification
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/mediaObjectCount=987481
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] our snapshot finished, downloading it
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/presignedFullFrameSnapshotUrl=https://arlos3-prod-z
2020-03-29 09:20:06 DEBUG (ArloEventStream) [pyaarlo] turning recent ON for Lilly's Arlo
2020-03-29 09:20:06 DEBUG (ArloBackgroundWorker) [pyaarlo] getting image for Lilly's Arlo
2020-03-29 09:20:06 DEBUG (ArloEventStream) [custom_components.aarlo.camera] callback:Lilly's Arlo:recentActivity:True
2020-03-29 09:20:06 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/lastImageSource=snapshot/03-29 09:20
2020-03-29 09:20:06 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/presignedLastImageData=b'\xff\xd8\xff\xe0\x00\x10JF
2020-03-29 09:20:06 DEBUG (ArloBackgroundWorker) [custom_components.aarlo.camera] callback:Lilly's Arlo:presignedLastImageData:b'\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01\x01\x00\x00\x01\x00\x01\x00\x00\xff\xdb\x
2020-03-29 09:20:06 DEBUG (ArloBackgroundWorker) [pyaarlo] set:ArloCamera/4SS176SX42771/activityState=idle
2020-03-29 09:20:06 DEBUG (ArloBackgroundWorker) [custom_components.aarlo.camera] callback:Lilly's Arlo:activityState:idle
2020-03-29 09:20:07 DEBUG (ArloEventStream) [pyaarlo] sending cameras/4SS176SX42771 to 4SS176SX42771
2020-03-29 09:20:07 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo BASE got cameras/4SS176SX42771
2020-03-29 09:20:07 DEBUG (ArloEventStream) [pyaarlo] set:ArloBase/4SS176SX42771/audioDetected=False
2020-03-29 09:20:07 DEBUG (ArloEventStream) [pyaarlo] Lilly's Arlo CAMERA got one cameras/4SS176SX42771
2020-03-29 09:20:07 DEBUG (ArloEventStream) [pyaarlo] set:ArloCamera/4SS176SX42771/audioDetected=False
2020-03-29 09:20:08 DEBUG (MainThread) [custom_components.aarlo.camera] stop_activity for lilly's_arlo
Same here. I don't see the demux errors anymore, but the stream doesn't load and the request for the playlist eventually fails without any errors in the logs.
Love for this to get fixed. Is it something HA needs to fix? Can't we bug them lol.
I tried to make it work without changing the stream component but I don't think that's possible.
So I started working on a PR for the main Home Assistant code but real life got in the way.
I'll try and aim for this weekend.
I'm facing the same issue. Do you think the root cause is related to the stream component? I tried the patch you mentioned before and it works.
I believe the issue is in the stream component. I started work on a PR but real work got in the way. I think adding an option to ignore a certain number of missing-dts packets should be sufficient and acceptable to the stream author.
Are we any closer to getting the stream bit patched? I'd love to have it fixed, but I'm not a fan of modifying bits in case it breaks something down the line.
Almost... I have to generate a patch for the detected I/O issue so I'm using that as a test. But if anybody wants to generate a PR, be my guest.
I would love to help but have no idea where to start :(
I have prepared this change:
https://github.com/dermotduffy/hass-core/commit/73b0d2fd14e783cd015c5f065b457f009f243300
Skipping a single packet without dts is sufficient for my Arlo Qs to stream correctly (and somehow psychologically skipping 1 bad packet feels less kludgey than skipping a const 10!).
@twrecked you appear to be working on this issue, and you know this area much better than I. Would you like me to submit this as a PR, or should I hold off (e.g. if you have a better fix in mind)?
@dermotduffy If you could submit it that would be great. I don't have any of the cameras that streaming doesn't work on, so I had no way of testing it before submitting it myself.
It now looks like Arlo has changed their streaming. I connected to their website today and the cameras are streaming in HLS directly from Arlo. This probably means we can bypass the stream: component and stream directly from Arlo. Obviously I need to figure out how.
Can somebody with Ultra cameras confirm this? You will need to open a debug console (CTRL + SHIFT + I in Chrome) and start a stream. Look for m4s packets.
I have a fix but it needs trying. As I mentioned above, it looks like Arlo is moving away from flash and is now using mpeg-dash to stream from their cameras. This means we can stream directly from Arlo instead of via Home Assistant. I modified the lovelace card to make this work. To try this, you need to add the following to your config and restart:
aarlo:
  user_agent: linux
  play_direct: true
Now try streaming. One bonus: the audio now works.
To undo, just remove the play_direct and user_agent changes. The lovelace card is backward compatible.
If somebody with Ultra cameras could try this, it would be great.
Nice, works great with my Arlo Q cameras!
Hurray -- it's a proper fix for streaming from Q cameras that doesn't rely on arbitrarily skipping packets without dts...
Confirmed streaming working on Arlo Qs and Pro 2s.
@dermotduffy Yeah, I just happened to notice the web interface was using mpeg-dash the other day.
The problem is, the user_agent breaks the stream: component, which doesn't recognize the mpd type, so no saving streams - you can try this by removing the play_direct from the Lovelace card. So we might still need your PR.
@twrecked, this doesn't seem to have an effect on the Arlo Pro 3 cameras. I can live stream just like I could before, without audio. And anything that is recorded I can play back, but I get no video, only audio.
@ssilence5 It doesn't affect the playback of library files; they still come in as an mpg file. Do the library files play on the web interface?
But you should get audio on the live stream. Can you check that on the web interface? We may need to add some parameters to the setup.
I have v0.6.8 of this integration installed through HACS in a virtualenv. All of the sensors and snapshots work fine, but when I use the lovelace card the stream shows up and doesn't actually move at all, while the log reports errors:
Occasionally, I also get a
2020-01-18 13:28:00 WARNING (ArloEventStream) [pyaarlo] general exception must be str, not NoneType
which doesn't seem to be related to any action I take. I've followed the streaming instructions at the bottom of the readme; here are the relevant parts of my config:
Let me know if I need to raise my logging level or anything.