Open adriancable opened 2 years ago
Following this to see what happens, as I have noticed the same behaviour.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Did you uncover any further information on this? I have never read anything in the documentation about a "minimum bitrate". All docs always refer to a maximum bitrate? How do certified accessories behave?
@Supereg - I'm not sure if it's possible to answer that. It would require reverse engineering the certified accessory to see how it interprets that value.
One other point, though. As well as the iOS logs themselves referring to that value as the minimum bitrate (and also providing another, much higher value which is referred to as maximum bitrate), it really doesn't seem plausible to interpret that value as a maximum bitrate, despite what the docs say.
The numbers are often extremely small, implausibly low for a sensible bitrate. For example we frequently see that value being provided by HomeKit as 132000. (Usually, the flow is that the 'start' request has 299000, and then shortly afterwards a 'reconfigure' request with 132000.) 132kbit/sec just isn't a plausible maximum bitrate for H264 video, at 640 x 360 or 1280 x 720 resolution. It's too low to give anything like acceptable quality video. This also suggests that the docs are wrong (and the iOS logs are right).
I do note that the non-commercial HomeKit spec docs from Apple that are 'floating around' are very very old, and it's quite possible this has been corrected in newer versions. Unfortunately, I don't have access to (and do not want access to) the 'commercial' up-to-date docs so have no way of checking.
@adriancable have you monitored what iOS sends in subsequent reconfigure stream requests? Maybe it starts with the minimum bitrate as the 'start' maximum bitrate and iteratively increases the bitrate. The reconfigure stream request is something most plugins don't really implement, as there is no easy way to do it with ffmpeg (when using the command line interface).
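For a CLI-based plugin, about the only way to honour a reconfigure request is to restart the encoder with new arguments. A minimal sketch of that idea, assuming TypeScript on Node; the `VideoParams` shape and `buildFfmpegArgs` helper are my own illustrations, not the HAP-NodeJS API (the ffmpeg flags themselves, `-b:v`, `-maxrate`, `-bufsize`, are standard rate-control options):

```typescript
// Hypothetical sketch: rebuild ffmpeg CLI arguments when HomeKit sends a
// reconfigure-stream request. Field and function names are illustrative.
interface VideoParams {
  width: number;
  height: number;
  fps: number;
  bitrateKbps: number; // whatever value the controller supplied
}

function buildFfmpegArgs(input: string, p: VideoParams): string[] {
  return [
    "-i", input,
    "-vcodec", "libx264",
    "-r", String(p.fps),
    "-vf", `scale=${p.width}:${p.height}`,
    "-b:v", `${p.bitrateKbps}k`,     // average target bitrate
    "-maxrate", `${p.bitrateKbps}k`, // cap the encoder at the same rate
    "-bufsize", `${2 * p.bitrateKbps}k`,
    "-f", "rtp", "rtp://127.0.0.1:5000",
  ];
}

// On a reconfigure request, a CLI-based plugin kills the running ffmpeg
// process and respawns it with the new arguments, e.g.:
//   child.kill("SIGTERM");
//   spawn("ffmpeg", buildFfmpegArgs(input, newParams));
```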
@Supereg - actually, it's the opposite. Usually, the 'reconfigure' request drops the bitrate.
For example, initially:

```
{
    tlvDatablob = (null)
    syncSource = 1947509574
    payloadType = 99
    minimumBitrate = 299000
    maximumBitrate = 1078000
    rtcpInterval = 0.5
    maxMTU = 1378
    comfortNoisePayloadType = (null)
}
```
Then 20 seconds later there's a 'reconfigure' request with a new minimumBitrate = 132000.
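If the value iOS supplies really is a minimum rather than a maximum, one way a plugin could use the two figures from a log like the one above is to clamp its preferred encoder bitrate between them. This is a sketch of that interpretation only; the function name and parameters are mine, not HAP-NodeJS:

```typescript
// Hypothetical sketch: choose an encoder bitrate treating the value
// HAP-NodeJS currently labels max_bit_rate (e.g. 299000, later 132000)
// as a floor, and homed's maximumBitrate (e.g. 1078000) as the ceiling.
function pickEncoderBitrate(
  minimumBitrate: number, // floor: what the controller supplied
  maximumBitrate: number, // ceiling: the enforced limit in the homed logs
  preferred: number       // what the plugin would like to send
): number {
  return Math.min(Math.max(preferred, minimumBitrate), maximumBitrate);
}
```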
Right now, when starting a video stream, HAP-NodeJS passes something like this to handleStreamRequest on my delegate:

Note `max_bit_rate`, and indeed, at least in the very old HomeKit docs from Apple I have, this is what that says, too. But I don't think it's right. I think it actually should be `min_bit_rate`. Looking at the console output from `homed` on iOS, I see this:

Note that what HAP-NodeJS returns as `max_bit_rate` matches `minimumBitrate` here (299000), not `maximumBitrate` (1078000).

Also, I did an experiment: if I send RTP data faster than what's called `maximumBitrate` in the homed logs for a few seconds, homed will kill the stream. So it really does appear that this is the enforced `maximumBitrate`, i.e. the iOS logs from homed are correct and HAP-NodeJS is wrong.

What I haven't looked at: how we get at the `maximumBitrate` from the RTP stream configuration TLV data sent by homed. This would probably be very useful for plug-ins to know. Right now, since plug-ins are interpreting what is actually the min bit rate as the max bit rate, camera plug-ins are almost universally sending much lower quality video to HomeKit than they should be, which is quite sad.