scottlamb / moonfire-nvr

Moonfire NVR, a security camera network video recorder

live view not working on iPhone #121

Closed. IronOxidizer closed this issue 2 weeks ago.

IronOxidizer commented 3 years ago

Describe the bug

An error page is displayed after selecting a live view stream.

e@http://192.168.1.64:8080/static/js/main.7d8533f5.chunk.js:1:29092
http://192.168.1.64:8080/static/js/main.7d8533f5.chunk.js:1:29576
Al@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:348325
http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:366138
bl@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:339244
bl@[native code]
http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:288719
http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:366138
qa@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:288665
Ya@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:288600
Ne@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:359949
Qt@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:267224
Qt@[native code]

Haven't been able to reproduce the error on Windows, Linux, or Android with any browser.

To Reproduce

Steps to reproduce the behavior:

  1. Go to web UI on iOS
  2. Click on Live View
  3. Select camera stream
  4. See error

Expected behavior

  1. Select camera
  2. Live view of camera appears

Server (please complete the following information):

Screenshots

(screenshot attached)

Smartphone

scottlamb commented 3 years ago

Do you know if it's possible to enable source maps to get a more useful backtrace? I don't have an iPhone or know how to debug on them. I do have a Mac laptop; there's probably an iPhone simulator or something but I've never tried before.

I think it's also possible to apply source maps after the fact, so I guess that's the next step if Mobile Safari doesn't support map files.

It's probably something minor given that this works on desktop Safari.
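
For reference, applying a source map after the fact can be done with the source-map npm package. This is just a sketch, assuming the build kept a main.7d8533f5.chunk.js.map file next to the chunk (the path below is a guess at the usual build output location); the line/column numbers come from the backtrace above:

// Sketch: resolve a minified frame such as main.7d8533f5.chunk.js:1:29092
// back to its original file/line, assuming the matching .map file exists.
import { readFileSync } from "fs";
import { SourceMapConsumer } from "source-map";

async function resolveFrame(mapPath: string, line: number, column: number) {
  const rawMap = JSON.parse(readFileSync(mapPath, "utf8"));
  const consumer = await new SourceMapConsumer(rawMap);
  // Prints the original source file, line, and column for the minified frame.
  console.log(consumer.originalPositionFor({ line, column }));
  consumer.destroy();
}

resolveFrame("build/static/js/main.7d8533f5.chunk.js.map", 1, 29092);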

IronOxidizer commented 3 years ago

Took some time, but I found a way to get console output on iOS (I don't own any Apple devices; this iPhone isn't even mine, so debugging with broken tooling took hours).

DEBUG /api/?days=true: 200
ERROR ReferenceError: Can't find variable: MediaSource
ERROR Uncaught error: ReferenceError: Can't find variable: MediaSource [object Object]

https://github.com/scottlamb/moonfire-nvr/blob/8465b49cfab5561886c9ccea848cb38ef6907ba1/ui/src/Live/LiveCamera.tsx#L129

This seems to be the only instance of MediaSource from what I can tell, and it looks like a well-known issue. It seems iOS doesn't support MSE, so I'm not sure how to handle this.
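
Not sure what the right fix is, but at minimum the UI could feature-detect instead of crashing. A rough sketch (not the actual LiveCamera.tsx code):

// Sketch: guard against MediaSource being entirely absent (as on iPhone
// Safari) so the user sees a clear message instead of an uncaught
// ReferenceError.
function mseSupported(mimeType: string): boolean {
  // typeof is safe even when the global doesn't exist at all.
  if (typeof MediaSource === "undefined") {
    return false;
  }
  return MediaSource.isTypeSupported(mimeType);
}

if (!mseSupported('video/mp4; codecs="avc1.4D401E"')) {
  // e.g. render "Live view isn't supported in this browser" instead of the
  // error page shown in the screenshot.
}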

scottlamb commented 3 years ago

Thanks for tracking that down!

Hopefully it isn't as bad as MSE not being supported at all. My whole approach for live view and the planned scrub bar UI depends on it.

https://caniuse.com/?search=mediasource is a bit more mixed. At the top for "Media Source Extensions" for "Safari on iOS" it says "~ Partial Support ... fully supported on iPadOS 13 and later." Kind of vague. Under "MediaSource API", "Safari on iOS" is in green. And I see now a subtest https://caniuse.com/mdn-api_mediasource_istypesupported which says MediaSource.isTypeSupported is supported. So does https://developer.mozilla.org/en-US/docs/Web/API/MediaSource/isTypeSupported . Maybe I'm holding it wrong?

scottlamb commented 3 years ago

Maybe all the stuff saying it's supported means only on iPads.😱

scottlamb commented 3 years ago

I guess the alternatives are HLS or WebRTC. There is a shiny new Rust library for WebRTC. Maybe that's the way to go. https://github.com/webrtc-rs/webrtc

IronOxidizer commented 3 years ago

One thing to note about going for a WebRTC approach is that we might need a different implementation depending on whether the client is on the same network or not. If the client is not on the same network, we would use the normal method of negotiating a connection via STUN servers. If the client is on the same network and we don't have internet connectivity (closed network), we would have to manually send SDP between the client and the server, probably with AJAX.

Maybe I'm over complicating it but this is an issue I previously had when working with WebRTC on either LAN or over the internet.
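
On the browser side, the closed-network flavor would look roughly like this. Just a sketch: the /api/webrtc endpoint is made up, and a real implementation would handle ICE gathering and errors more carefully.

// Sketch: manual SDP offer/answer over plain HTTP, no STUN/TURN, for the
// case where client and server share a LAN. The /api/webrtc endpoint is
// hypothetical.
async function startWebRtcLive(video: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection({ iceServers: [] }); // host candidates only
  pc.ontrack = (ev) => {
    video.srcObject = ev.streams[0];
  };
  pc.addTransceiver("video", { direction: "recvonly" });

  await pc.setLocalDescription(await pc.createOffer());

  // Non-trickle ICE: wait until the local host candidates are in the SDP.
  if (pc.iceGatheringState !== "complete") {
    await new Promise<void>((resolve) => {
      pc.addEventListener("icegatheringstatechange", () => {
        if (pc.iceGatheringState === "complete") resolve();
      });
    });
  }

  // Hand the SDP offer to the server over HTTP and apply its answer.
  const resp = await fetch("/api/webrtc", {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: pc.localDescription!.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await resp.text() });
}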

scottlamb commented 3 years ago

One thing to note about going for a WebRTC approach is that we might need a different implementation depending on whether the client is on the same network or not.

Would we still need STUN if the server is Internet-accessible (not behind NAT or using port forwarding)? It'd be nice to avoid that complexity. I don't really know anything about WebRTC yet or have any experience with it.

I also took a quick look into the HLS approach. It doesn't look too hard if we can use fragmented .mp4 files. We already have (extensive) logic for generating those. There's also a proof-of-concept Rust crate called lowly, written by the same author as the h264-reader crate we're using. I don't think we'd use lowly directly for several reasons (e.g. it uses ffmpeg to generate its .mp4 files where we have our own logic), but it might be a place to look for inspiration.

I'm still grieving the inability to use MSE on all platforms. That was how I'd planned to do the scrub bar stuff. WebCodecs would be even better but I know that won't be viable for a long time. I wonder if Apple will add support for MSE on iPhone soon, given that they apparently support it on iPad. It seems so strange they'd support it in one place but not the other.

scottlamb commented 3 years ago

Updating the title to reflect that I think this only happens on iPhone; iPads apparently do support MSE like desktop Safari.

The version I'm about to release will at least give a more helpful error message, directing to this issue.

I'm thinking now the best course of action might be to have the Javascript UI create an adapter between the current HTTP API and the HLS approach that iPhones apparently need. Then we could keep pushing live segments immediately over the WebSocket rather than switching to polling.

hn commented 5 months ago

I'm not much into these frontend things, but the new Managed Media Source API might be an option to resolve this issue (sources say it's easy to implement; it's available on iPhone since iOS 17). More info e.g. here or here.

scottlamb commented 4 months ago

Thanks for the pointer! Some progress:

  1. It turns out there's an iPhone simulator available on macOS as part of Xcode, so I'm able to actually test this even though I don't have access to a physical iPhone.
  2. That webkit.org link had a key paragraph that I missed. I figured it out by trial and error with code samples instead, and just noticed it now, after the fact.

    Note that support for Managed Media Source is only available when an AirPlay source alternative is present, or remote playback is explicitly disabled.

  3. I'm able to switch from MediaSource to ManagedMediaSource on Safari/macOS (basically a no-op, as MediaSource was working fine), but it's still not working on iPhone. The problem is window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="avc1.4D401E"') returns false, and even if I disable that check, I don't see the video actually show up. I'm not sure what the problem is. I wondered if it was the container format. I could add support to Moonfire for MPEG-TS, but window.ManagedMediaSource.isTypeSupported('video/mp2ts; codecs="avc1.4D401E"') also returns false, so there's probably no point. (screenshot attached)

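
For the record, the feature-detection/fallback I'm trying is roughly this. A sketch, not the exact UI code; ManagedMediaSource isn't in TypeScript's lib.dom.d.ts yet, hence the any cast:

// Sketch: prefer ManagedMediaSource (iOS 17+) and fall back to classic
// MediaSource elsewhere. Per the webkit.org note quoted above, MMS only
// works when remote playback is disabled or an AirPlay alternative exists.
function newMediaSource(video: HTMLVideoElement): MediaSource | null {
  const w = window as any; // ManagedMediaSource isn't typed yet
  if (typeof w.ManagedMediaSource !== "undefined") {
    video.disableRemotePlayback = true; // satisfy the MMS attachment rule
    return new w.ManagedMediaSource();
  }
  if (typeof MediaSource !== "undefined") {
    return new MediaSource();
  }
  return null; // neither API exists; the UI should show an error instead
}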

scottlamb commented 4 months ago

On second thought, this might be a simulator problem. I wonder if window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="avc1.4D401E"') would return true on a physical iPhone.

hn commented 4 months ago

The most complicated Javascript code I've ever written (because I've never written any JS before):

foo = window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="avc1.4D401E"')
bar = window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="nonsense"')

console.log(foo)
console.log(bar)

Returns foo=true and bar=false on a physical iPhone running iOS 17.4 (and, as expected, a TypeError on desktop MS Edge).

scottlamb commented 4 months ago

That's encouraging! My change may be enough then.

hn commented 4 months ago

After downloading a zillion fancy build somethings I can say ... hooray, it (the iphone branch) works :)

I see all my cameras e.g. in 2x2 view, which is great. But ... they do not refresh (still images). If I select (change) a camera in one of the dropdowns the video comes to foreground and refreshes constantly (live stream). It switches to still image if I stop the foreground mode. Disclaimer: all based on a quick 5 minute test.

scottlamb commented 4 months ago

Progress! Is there anything useful written in the Javascript console?

hn commented 4 months ago

As far as I can see, there is no easily accessible javascript console on the iPhone. You have to connect the phone to a computer via cable and debug there. I can't say at the moment when I will have time to do this.

Pure guesswork: Possibly the behavior with the still images is not a bug, but the intention of the "Managed" Media Source so that no battery power is wasted. Perhaps it is possible to set some kind of priority flag, which will cause the videos to play live?

In any case, the current state of the code is already a good usable step forward.

scottlamb commented 4 months ago

As far as I can see, there is no easily accessible javascript console on the iPhone. You have to connect the phone to a computer via cable and debug there. I can't say at the moment when I will have time to do this.

I think you're right; it's a similar experience for Chrome/Android.

Pure guesswork: Possibly the behavior with the still images is not a bug, but the intention of the "Managed" Media Source so that no battery power is wasted. Perhaps it is possible to set some kind of priority flag, which will cause the videos to play live?

That doesn't match my understanding of how it's supposed to work. The startstreaming and endstreaming events are hints; you can totally ignore them.

This part of the Moonfire UI still says (experimental) because I know there are bugs/omissions in my handling in general. I've seen error messages in Firefox still; if you put your device to sleep then wake it up (or maybe even just tab away and back), you can end up with several seconds of stale video in the buffer; etc. So I wouldn't be surprised if I have more work to do. I'm just happy it seems to be fundamentally possible to make this API work on iPhone where it didn't before. Not having one API that would work on all devices really took the wind out of my sails in terms of developing a nice UI for Moonfire.
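
Concretely, the hints I mean are just events on the ManagedMediaSource object. A sketch of how they could be wired up, not necessarily what the UI does today:

// Sketch: ManagedMediaSource's streaming hints. The UA fires these when it
// wants data to start or stop flowing; honoring them saves battery/network,
// but ignoring them is allowed.
const w = window as any; // ManagedMediaSource isn't in lib.dom.d.ts yet
const mms = new w.ManagedMediaSource();

mms.addEventListener("startstreaming", () => {
  // e.g. resume reading live segments from the WebSocket
});
mms.addEventListener("endstreaming", () => {
  // e.g. stop fetching; whatever is already buffered keeps playing
});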

IronOxidizer commented 4 months ago

Great to see progress being made on this!

As far as I can see, there is no easily accessible javascript console on the iPhone.

You can enable log collection using chrome://inspect as mentioned in https://blog.chromium.org/2019/03/debugging-websites-in-chrome-for-ios.html

Enable JavaScript log collection by navigating to chrome://inspect in Chrome for iOS and leaving that tab open to collect logs. In another tab, reproduce the case for which you are interested. Then switch back to the chrome://inspect tab to view the collected logs. (Log collection will stop if the chrome://inspect page closes or navigates and logs will be lost as they are not persisted.)

Curid commented 4 months ago

I have a pure Rust LL-HLS server if you're interested; I don't know if it works with Safari, though.

scottlamb commented 2 weeks ago

I bought a used iPhone SE 3rd gen (2022) for developing Moonfire NVR and other projects.

I see all my cameras e.g. in 2x2 view, which is great. But ... they do not refresh (still images). If I select (change) a camera in one of the dropdowns the video comes to foreground and refreshes constantly (live stream). It switches to still image if I stop the foreground mode. Disclaimer: all based on a quick 5 minute test.

I can see exactly this behavior for myself now. And I think the solution is really simple: add a playsinline attribute, e.g. <video playsinline>.

The same full-screen behavior / auto-pause when you leave it seems to be present for the list view stuff. But it's less problematic there because that uses <video controls>, so you can use those controls to unpause it. The live view doesn't offer a way to do that, so without playsinline, once you exit the full-screen player, it just stalls until you switch to another camera.

I think that will bring iPhone up to parity with other devices (although hardly perfection yet).
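
In React terms it's just the playsInline prop on the element. A sketch, not the actual LiveCamera.tsx change; the ref and the muted/autoPlay props are only illustrative:

// Sketch: inline playback on iPhone. Without playsinline, Safari pushes the
// live <video> into its full-screen player and stalls it when you exit.
import React, { useRef } from "react";

export function LiveVideo() {
  const videoRef = useRef<HTMLVideoElement | null>(null); // hypothetical ref
  return <video ref={videoRef} playsInline muted autoPlay />;
}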

hn commented 2 weeks ago

I am happy to confirm that the live-view here now also works as intended!