video-dev / hls.js

HLS.js is a JavaScript library that plays HLS in browsers with support for MSE.
https://hlsjs.video-dev.org/demo

Video crashing on Tizen 2023 TVs #6562

Closed actionakin closed 2 months ago

actionakin commented 2 months ago

What version of Hls.js are you using?

1.5.13

What browser (including version) are you using?

Chromium m94

What OS (including version) are you using?

Samsung Tizen 7.0

Test stream

https://mlb-cuts-diamond.mlb.com/FORGE/2024/2024-02/17/981e4101-03fa31e9-8a73f8e8-csvm-diamondx64-asset.m3u8

Configuration

{}

Additional player setup steps

I'm using the latest hls.js demo app with the src param set in the URL.
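
For reference, this is roughly how the demo URL can be built (assuming the demo page's standard src query parameter, which takes a URL-encoded playlist URL):

// Sketch: build the demo URL with the test stream as the src query parameter.
const playlist =
  'https://mlb-cuts-diamond.mlb.com/FORGE/2024/2024-02/17/981e4101-03fa31e9-8a73f8e8-csvm-diamondx64-asset.m3u8';
const demoUrl =
  'https://hlsjs.video-dev.org/demo/?src=' + encodeURIComponent(playlist);
console.log(demoUrl); // open this URL in the Tizen-hosted app's web view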

Checklist

Steps to reproduce

  1. Launch a Tizen-hosted app using the demo URL
  2. Watch the "Real-time metrics" tab
  3. Let the demo play for 10-30 minutes

Expected behaviour

The video should play without interruption

What actually happened?

The "appending" time under "load event" section climbs to 5000ms+, eventually the whole app grinds to a halt and crashes.

[Screenshot attached: 2024-07-18 7:55 PM]

Console output

base-stream-controller.ts:635 [log] > [stream-controller]: Buffered main sn: 799 of level 4 (frag:[4792.821-4798.848] > buffer:[4664.693-4798.827])
base-stream-controller.ts:1958 [log] > [stream-controller]: PARSED->IDLE
base-stream-controller.ts:843 [log] > [stream-controller]: Loading main sn: 800 of level 4 (frag:[4798.827-4804.833]) cc: 0 [1-1051], target: 4798.827
base-stream-controller.ts:1958 [log] > [stream-controller]: IDLE->FRAG_LOADING
buffer-controller.ts:1253 [log] > [buffer-controller]: Removing [0,4669.183086584264] from the audio SourceBuffer
buffer-controller.ts:1253 [log] > [buffer-controller]: Removing [0,4669.183086584264] from the video SourceBuffer
base-stream-controller.ts:449 [log] > [stream-controller]: Loaded main sn: 800 of level 4
base-stream-controller.ts:1958 [log] > [stream-controller]: FRAG_LOADING->PARSING
transmuxer-interface.ts:394 [log] > [transmuxer.ts]: Flushed main sn: 800 of level 4
base-stream-controller.ts:1958 [log] > [stream-controller]: PARSING->PARSED
base-stream-controller.ts:1920 [log] > [stream-controller]: Parsed main sn: 800 of level 4 (frag:[4798.827-4804.843])
buffer-controller.ts:573 buffering
base-stream-controller.ts:635 [log] > [stream-controller]: Buffered main sn: 800 of level 4 (frag:[4798.827-4804.843] > buffer:[4670.699-4804.833])
base-stream-controller.ts:1958 [log] > [stream-controller]: PARSED->IDLE
base-stream-controller.ts:843 [log] > [stream-controller]: Loading main sn: 801 of level 4 (frag:[4804.833-4810.839]) cc: 0 [1-1051], target: 4804.833
base-stream-controller.ts:1958 [log] > [stream-controller]: IDLE->FRAG_LOADING
buffer-controller.ts:1253 [log] > [buffer-controller]: Removing [0,4675.184578496728] from the audio SourceBuffer
buffer-controller.ts:1253 [log] > [buffer-controller]: Removing [0,4675.184578496728] from the video SourceBuffer
base-stream-controller.ts:449 [log] > [stream-controller]: Loaded main sn: 801 of level 4
base-stream-controller.ts:1958 [log] > [stream-controller]: FRAG_LOADING->PARSING
transmuxer-interface.ts:394 [log] > [transmuxer.ts]: Flushed main sn: 801 of level 4
base-stream-controller.ts:1958 [log] > [stream-controller]: PARSING->PARSED
base-stream-controller.ts:1920 [log] > [stream-controller]: Parsed main sn: 801 of level 4 (frag:[4804.833-4810.859])
buffer-controller.ts:573 buffering
base-stream-controller.ts:635 [log] > [stream-controller]: Buffered main sn: 801 of level 4 (frag:[4804.833-4810.859] > buffer:[4676.705-4810.839])
base-stream-controller.ts:1958 [log] > [stream-controller]: PARSED->IDLE
base-stream-controller.ts:843 [log] > [stream-controller]: Loading main sn: 802 of level 4 (frag:[4810.839-4816.845]) cc: 0 [1-1051], target: 4810.839
base-stream-controller.ts:1958 [log] > [stream-controller]: IDLE->FRAG_LOADING
buffer-controller.ts:1253 [log] > [buffer-controller]: Removing [0,4681.186070409193] from the audio SourceBuffer
buffer-controller.ts:1253 [log] > [buffer-controller]: Removing [0,4681.186070409193] from the video SourceBuffer
base-stream-controller.ts:449 [log] > [stream-controller]: Loaded main sn: 802 of level 4
base-stream-controller.ts:1958 [log] > [stream-controller]: FRAG_LOADING->PARSING
transmuxer-interface.ts:394 [log] > [transmuxer.ts]: Flushed main sn: 802 of level 4
base-stream-controller.ts:1958 [log] > [stream-controller]: PARSING->PARSED
base-stream-controller.ts:1920 [log] > [stream-controller]: Parsed main sn: 802 of level 4 (frag:[4810.839-4816.853])
buffer-controller.ts:573 buffering

Chrome media internals output

No response

actionakin commented 2 months ago

This is probably an issue related to Tizen's codec support on the 2023 models, but I can't tell. Any insight into what might be going on would be very appreciated. Today playback made it through the entire 1h45min stream, but I still saw 2000-3000ms 'appending' times.

actionakin commented 2 months ago

Some additional notes:

https://cph-p2p-msl.akamaized.net/hls/live/2000341/test/master.m3u8
https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8
https://d2zihajmogu5jn.cloudfront.net/bipbop-advanced/bipbop_16x9_variant.m3u8

robwalch commented 2 months ago

Hi @actionakin, have you tried isolating the issue to a single variant?

For example, if you only load the m3u8 for avc1.640028 @4Mbps, does the issue still reproduce? Is it specific to avc1.640029 @6.4Mbps?
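
For reference, pinning playback to one rendition can be done either by loading the variant's media playlist directly, or by fixing the level in code; a rough sketch (masterPlaylistUrl and video are placeholders, and the level index is illustrative):

// Sketch: keep hls.js on a single rendition so a variant can be isolated.
const hls = new Hls({ startLevel: 0 });
hls.loadSource(masterPlaylistUrl);
hls.attachMedia(video);

hls.on(Hls.Events.MANIFEST_PARSED, () => {
  // Setting currentLevel to a fixed index disables automatic level switching;
  // inspect hls.levels to pick the rendition you want to test.
  hls.currentLevel = 0;
});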

actionakin commented 2 months ago

Yes I pinned the level to the lowest bitrate. It lasted a little longer but the issue still occurred.

robwalch commented 2 months ago

Please add the config you are testing with to the description. Note that the demo page can be memory intensive. While it is useful for debugging issues, long-form playback should be tested on a page optimized for viewing.

That being said, I am curious to know whether disabling the worker (enableWorker: false) or lowering backBufferLength changes the behavior.

actionakin commented 2 months ago

The issue does not reproduce in the browser. I've even tried playback in Chrome 94 via SauceLabs with no issues.

Here's our prod config:

 {
    "nudgeOffset": 0.1,
    "maxFragLookUpTolerance": 0.25,
    "highBufferWatchdogPeriod": 5,
    "nudgeMaxRetry": 10,
    "maxBufferLength": 60,
    "maxMaxBufferLength": 60,
    "capLevelToPlayerSize": false,
    "enableWebVTT": true,
    "enableCEA708Captions": true,
    "liveDurationInfinity": false,
    "fragLoadingMaxRetryTimeout": 64000,
    "forceKeyFrameOnDiscontinuity": false,
    "progressive": false,
    "liveSyncDurationCount": 3,
    "initialLiveManifestSize": 1,
    "abrBandWidthFactor": 0.98,
    "abrBandWidthUpFactor": 0.6,
    "enableSoftwareAES": false,
    "fragLoadingMaxRetry": 8,
    "levelLoadingMaxRetry": 6,
    "backBufferLength": 60,
    "maxBufferSize": 60000000,
    "debug": false,
    "startLevel": 3,
    "startPosition": 0
}

I'll report back shortly after disabling the worker / backBufferLength.

actionakin commented 2 months ago

I set enableWorker: false and backBufferLength: -1 and 🤞 so far it's holding at around 200-400ms. I'm going to continue to test different videos but this may be our solution.

actionakin commented 2 months ago

No luck. I ran it with these configurations in my local build and it started locking up around the 19min mark of playback. I also tried setting just one or the other and got similar results. Using the demo app running on the TV, I did seem to be getting much lower "appending" times, but I didn't test for longer than 10min before running tests with our app.

agajassi commented 2 months ago

@actionakin I experienced the same issue on our Tizen 2023s. For us, disabling the worker corrected the problem. However, our back buffer length and max buffer size settings are lower than yours, so maybe it is the combination of those and enableWorker: false that did the trick for us.

You can also try using Samsung's own profiling tool to monitor performance. If you enter MUTE 1 1 4 MUTE or MUTE 1 8 3 MUTE with your remote control, a small yellow overlay panel with CPU/memory stats should be displayed.

If I am not mistaken, Samsung 2023s are on Tizen 7.0. There are some improvements listed in Tizen 8.0's release notes. I wonder if that has anything to do with this not being reproducible on 2024 models, though I might be wrong about that.

[Screenshot attached: 2024-06-05 10:09 AM]

agajassi commented 2 months ago

@actionakin I have a Samsung 2023. Since I've experienced this problem myself in the past and am familiar with it, I will see if I can reproduce your issue with the prod config you shared above.

actionakin commented 2 months ago

@agajassi You rock. I tried again yesterday and realized that we had some code that was dynamically overriding our backBufferLength and maxBufferSize. I disabled that and hard-coded those values, as well as enableWorker: false. I was still experiencing an issue, this time related to an empty forward buffer. I'll be looking into it more today, as well as disabling our custom ABRController.
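
For context, a custom ABR controller is typically injected through hls.js's abrController config option, so ruling ours out is just a matter of not passing the override; a rough sketch (CustomAbrController and its import path are placeholders):

import Hls from 'hls.js';
import CustomAbrController from './custom-abr-controller'; // placeholder path

// What we normally run: the custom controller is injected via config.
const hlsCustom = new Hls({ abrController: CustomAbrController });

// Ruling it out: omit the override so hls.js falls back to its built-in ABR.
const hlsDefault = new Hls();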

actionakin commented 2 months ago

Update: I've now got our app running with the recommended config changes. I realized that I was still testing while running in development mode with the tizen debugger attached. Now everything is running smoothly and video playback is holding strong. I'm still in the middle of running long tests but it's looking very promising.

agajassi commented 2 months ago

Ohh yes, I forgot to mention that you can't have the Debugger open for this test, 'cos it adds its own overhead. If I need the logs for whatever reason, I usually just display them on the screen and close the debugger. Glad to hear that it is holding up so far. Let us know if this resolves your issues.

Also keep in mind that @robwalch optimized Web Worker use, and that PR should be released with v1.6.0. You might want to consider upgrading to that release later as well.
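
If it helps, mirroring console output onto the TV screen so the debugger can stay closed can be done with something like this (a minimal sketch; the overlay element and styling are arbitrary):

// Sketch: mirror console.log into an on-screen overlay so the Tizen debugger
// can stay closed while hls.js logs remain visible.
const panel = document.createElement('pre');
panel.style.cssText =
  'position:fixed;bottom:0;left:0;max-height:40%;overflow:auto;margin:0;' +
  'background:rgba(0,0,0,0.7);color:#0f0;font-size:12px;z-index:9999;';
document.body.appendChild(panel);

const originalLog = console.log.bind(console);
console.log = (...args) => {
  originalLog(...args);
  panel.textContent += args.map(String).join(' ') + '\n';
  panel.scrollTop = panel.scrollHeight; // keep the newest lines visible
};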

actionakin commented 2 months ago

@agajassi Noted about v1.6.0. So far I've played video for about 45 min and I'm seeing steady memory usage. This is the best result I've had in 2 weeks ;)

actionakin commented 2 months ago

Touchdown! We updated the config with

{
    backBufferLength: -1,
    enableWorker: false,
    maxBufferSize: 30000000,
}

Everything is playing smoothly now. Thanks for all the help @agajassi and @robwalch
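
For completeness, a sketch of what the resulting setup looks like when these overrides are merged into the production config shared earlier (prodConfig, playlistUrl, and videoElement are placeholders):

import Hls from 'hls.js';

// prodConfig is the production configuration posted earlier in this thread.
const tizen2023Overrides = {
  backBufferLength: -1,      // let the browser manage the back buffer
  enableWorker: false,       // avoid the worker-related slowdown on Tizen 7.0
  maxBufferSize: 30000000,   // ~30 MB forward buffer cap (bytes)
};

const hls = new Hls({ ...prodConfig, ...tizen2023Overrides });
hls.loadSource(playlistUrl);
hls.attachMedia(videoElement);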