samirkumardas / jmuxer

jMuxer - a simple javascript mp4 muxer that works in both browser and node environment.

Media resource blob could not be decoded #57

Closed rytec-nl closed 3 years ago

rytec-nl commented 3 years ago

Hi,

I'm trying to stream rtsp cameras through a rust + gstreamer backend, to the browser and play it using jmuxer.

I've gotten it to a point where the first couple of frames appear in the video element, but then it stops with the following error:

Media resource blob:http://localhost:3000/c75f8714-b948-4f15-a675-240fd1cfe08a could not be decoded.
Media resource blob:http://localhost:3000/c75f8714-b948-4f15-a675-240fd1cfe08a could not be decoded, error: Error Code: NS_ERROR_DOM_MEDIA_FATAL_ERR (0x806e0005)

This is with the following config, in Firefox 80.0 (Chromium does not give any error, but the video freezes as well):

var jmuxer = new JMuxer({
  node: 'camera',
  mode: 'video',
  debug: false,
  flushingTime: 2,
  clearBuffer: false,
  fps: 30,
});
var ws = new WebSocket("ws://127.0.0.1:9000");

ws.binaryType = 'arraybuffer';

ws.addEventListener("message", (msg) => {
  jmuxer.feed({
    video: new Uint8Array(msg.data),
  });
});

If I set clearBuffer to true I also get a lot of these errors:

Uncaught DOMException: An attempt was made to use an object that is not, or is no longer, usable jmuxer.js:2100
    initCleanup http://localhost:3000/jmuxer.js:2100
    clearBuffer http://localhost:3000/jmuxer.js:2506
    interval http://localhost:3000/jmuxer.js:2456
    (Async: setInterval handler)
    startInterval http://localhost:3000/jmuxer.js:2452
    JMuxmer http://localhost:3000/jmuxer.js:2230

Is this something you recognize? I'm a bit worried it could be the h264 encoding itself (these are not the best cameras), but I have gotten them streaming pretty reliably using hls.js, so a browser should be able to play the stream.

If you need any more info let me know,

Thanks in advance!

rytec-nl commented 3 years ago

I did some further testing: if I increase the flushingTime to about 20 seconds or more, a lower-resolution stream (I think 720p) starts to play. With the full-resolution (2K) stream it hangs with the same error. However, it also takes about 20 seconds before playback starts (the start delay is always roughly the flushingTime). Looking at the source code this does not make sense to me, as this value is only used for an interval that drops buffers from an array. Am I missing something? (I did not see doAppend().)

As a side note:

The README says: "flushingTime - Buffer flushing time in seconds. Default value is 1500 milliseconds." That inconsistency made me set it to 2 in my first post, instead of 2000 for two seconds.
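For clarity, here is what I intended, assuming flushingTime is actually in milliseconds (as the 1500 ms default suggests); the other values are just my setup:

```javascript
// Assuming flushingTime is in milliseconds (the README's "seconds" wording
// contradicts its stated 1500 ms default), two seconds of buffering is:
var jmuxer = new JMuxer({
  node: 'camera',       // id of the target <video> element
  mode: 'video',
  flushingTime: 2000,   // 2 seconds expressed in milliseconds, not 2
  fps: 30,
});
```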

samirkumardas commented 3 years ago

@reyapo it seems jmuxer gets a partial frame in each chunk in the case of high-resolution video. Right now, jmuxer expects every chunk (each time you feed the buffer) to contain the NALUs of a full frame.

Could you please provide a portion of the h264 buffer so I can test? Or you can play it using this player: https://samirkumardas.github.io/jmuxer/h264_player.html

If it works fine in the h264 player, then it must be the partial-frame issue I mentioned.
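One way to check for partial frames yourself, assuming the stream uses Annex-B start codes, is to list the NAL unit types inside each incoming chunk; this helper is not part of jmuxer's API, just a diagnostic sketch. If a chunk never contains a slice NALU (type 1 or 5), frames are arriving split across messages:

```javascript
// Diagnostic sketch: split an Annex-B H.264 buffer on 00 00 01 /
// 00 00 00 01 start codes and report the nal_unit_type of each NALU.
function listNaluTypes(buf) {
  const types = [];
  let i = 0;
  while (i + 3 < buf.length) {
    if (buf[i] === 0 && buf[i + 1] === 0 &&
        (buf[i + 2] === 1 || (buf[i + 2] === 0 && buf[i + 3] === 1))) {
      const headerPos = i + (buf[i + 2] === 1 ? 3 : 4); // skip the start code
      types.push(buf[headerPos] & 0x1f); // low 5 bits = nal_unit_type
      i = headerPos + 1;
    } else {
      i++;
    }
  }
  return types;
}

// Example: SPS (7), PPS (8), then an IDR slice (5) in one buffer.
const sample = new Uint8Array([
  0, 0, 0, 1, 0x67, 0x42,   // SPS, header byte 0x67 -> type 7
  0, 0, 0, 1, 0x68, 0xce,   // PPS, header byte 0x68 -> type 8
  0, 0, 1, 0x65, 0x88,      // 3-byte start code + IDR slice, 0x65 -> type 5
]);
console.log(listNaluTypes(sample)); // [7, 8, 5]
```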

rytec-nl commented 3 years ago

It indeed looks like I'm sending partial frames! If I record a bit and use your h264 player, it plays flawlessly.

Perhaps I can take a stab at buffering partial frames. Is that something you would like? My JavaScript is not great (I mostly program C++ for my job), but if I can get something functional, it's a start at least :)

samirkumardas commented 3 years ago

You have to make sure that you are feeding buffers of full frames. Before feeding into jmuxer, you can wait and accumulate the incoming data until all NALUs of a frame have arrived. I know it is a little complex to achieve, as it requires parsing the NAL units again outside of JMuxer.

In our case, we make sure of this on the server end, so on the JMuxer side we simply feed the buffers without any further processing.
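That accumulation step could be sketched as follows, under two assumptions that are mine and not guaranteed by jmuxer: the transport delivers one Annex-B NAL unit (with start code) per message, and each frame ends with exactly one slice NALU (type 1 or 5), so the slice closes the access unit and SPS/PPS/SEI delivered before it travel in the same feed:

```javascript
// Heuristic sketch: collect NAL units until a slice NALU arrives, then
// emit everything collected so far as one full-frame buffer.
class FrameAssembler {
  constructor(onFrame) {
    this.onFrame = onFrame;  // called with a Uint8Array holding one full frame
    this.pending = [];       // NALUs (with start codes) awaiting their slice
    this.pendingBytes = 0;
  }

  push(nalu) {               // nalu: Uint8Array, one start code + one NAL unit
    this.pending.push(nalu);
    this.pendingBytes += nalu.length;
    const header = nalu[nalu[2] === 1 ? 3 : 4]; // skip 3- or 4-byte start code
    const type = header & 0x1f;                 // low 5 bits = nal_unit_type
    if (type === 1 || type === 5) this.flush(); // slice ends the frame
  }

  flush() {
    const frame = new Uint8Array(this.pendingBytes);
    let offset = 0;
    for (const n of this.pending) { frame.set(n, offset); offset += n.length; }
    this.pending = [];
    this.pendingBytes = 0;
    this.onFrame(frame);
  }
}
```

Wired between the WebSocket and jmuxer, this would look like `const assembler = new FrameAssembler(frame => jmuxer.feed({ video: frame }));` with `assembler.push(new Uint8Array(msg.data));` in the message handler.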

Anyway, I will add this feature inside JMuxer; in fact, you could say it is a bug. Unfortunately, I don't have an exact timeframe, but I will add it as soon as I get some time.

good luck!

rytec-nl commented 3 years ago

I have a working solution for my situation by modifying the jmuxer client. The issue was indeed that I was sending every NAL unit in a separate message, so I had to group them together again. I copied your chunk-extractor logic from server-h264.js, made a new ChunkBuffer class, and used it in the feed function. I tried to make it configurable, but it's not quite there yet as a final solution. I haven't tested audio syncing, and I broke the duration parameter.

Here is the diff in case you have use for it:

diff --git a/src/parsers/h264.js b/src/parsers/h264.js
index e0351a7..40c1629 100644
--- a/src/parsers/h264.js
+++ b/src/parsers/h264.js
@@ -1,7 +1,41 @@
-import { ExpGolomb } from '../util/exp-golomb.js';
-import { NALU } from '../util/nalu.js';
+import {ExpGolomb} from '../util/exp-golomb.js';
+import {NALU} from '../util/nalu.js';
 import * as debug from '../util/debug';

+export class ChunkBuffer {
+    constructor(props) {
+        this.props = props;
+        this.videoBuffer = [];
+        this.audioBuffer = [];
+        this.chunk = [];
+        this.minNaluPerChunk = 30;
+    }
+
+    sink(data) {
+        // let duration = data.duration ? parseInt(data.duration) : 0;
+        if (data.video) {
+            let nal_units = H264Parser.extractNALu(data.video);
+
+            for (let nal_unit of nal_units) {
+                let ntype = nal_unit[0] & 0x1f;
+                this.chunk.push(nal_unit);
+                if (this.chunk.length >= this.minNaluPerChunk && ntype !== 1 && ntype !== 5) {
+                    this.videoBuffer.push(this.chunk);
+                    this.chunk = [];
+                }
+            }
+
+            if (this.videoBuffer.length > this.props.bufferSize) {
+                this.props.src(this.videoBuffer.shift(), this.audioBuffer.shift());
+            }
+        }
+
+        if (data.audio) {
+            this.audioBuffer.push(data.audio);
+        }
+    }
+}
+
diff --git a/src/jmuxer.js b/src/jmuxer.js
index 7273a13..5bfa3f3 100644
--- a/src/jmuxer.js
+++ b/src/jmuxer.js
@@ -1,7 +1,7 @@
 import * as debug from './util/debug';
-import { NALU } from './util/nalu.js';
-import { H264Parser } from './parsers/h264.js';
-import { AACParser } from './parsers/aac.js';
+import {NALU} from './util/nalu.js';
+import {ChunkBuffer, H264Parser} from './parsers/h264.js';
+import {AACParser} from './parsers/aac.js';
 import Event from './util/event';
 import RemuxController from './controller/remux.js';
 import BufferController from './controller/buffer.js';
@@ -25,7 +25,8 @@ export default class JMuxmer extends Event {
             clearBuffer: true,
             onReady: null, // function called when MSE is ready to accept frames
             fps: 30,
-            debug: false
+            debug: false,
+            bufferSize: 1
         };
         this.options = Object.assign({}, defaults, options);

@@ -43,22 +44,40 @@ export default class JMuxmer extends Event {
         this.frameDuration = (1000 / this.options.fps) | 0; // todo remove

         this.node = typeof this.options.node === 'string' ? document.getElementById(this.options.node) : this.options.node;
-
+
         this.sourceBuffers = {};
         this.isMSESupported = !!window.MediaSource;
-
+
         if (!this.isMSESupported) {
             throw 'Oops! Browser does not support media source extension.';
         }

+        this.chunkBuffer = new ChunkBuffer({
+            bufferSize: this.options.bufferSize, src: (video, audio) => {
+                let chunks = {
+                    video: this.getVideoFrames(video, 0),
+                    audio: []
+                };
+
+                if (audio) {
+                    let slices = AACParser.extractAAC(audio);
+                    if (slices.length > 0) {
+                        chunks.audio = this.getAudioFrames(slices, 0);
+                    }
+                }
+                console.log('remux');
+                this.remuxController.remux(chunks);
+            }
+        });
+
         this.setupMSE();
-        this.remuxController = new RemuxController(this.options.clearBuffer);
+        this.remuxController = new RemuxController(this.options.clearBuffer);
         this.remuxController.addTrack(this.options.mode);

         this.mseReady = false;
         this.lastCleaningTime = Date.now();
         this.kfPosition = [];
-        this.kfCounter  = 0;
+        this.kfCounter = 0;

         /* events callback */
         this.remuxController.on('buffer', this.onBuffer.bind(this));
@@ -76,35 +95,8 @@ export default class JMuxmer extends Event {
     }

     feed(data) {
-        let remux = false,
-            slices,
-            duration,
-            chunks = {
-                video: [],
-                audio: []
-            };
-
         if (!data || !this.remuxController) return;
-        duration = data.duration ? parseInt(data.duration) : 0;
-        if (data.video) {
-            slices = H264Parser.extractNALu(data.video);
-            if (slices.length > 0) {
-                chunks.video = this.getVideoFrames(slices, duration);
-                remux = true;
-            }
-        }
-        if (data.audio) {
-            slices = AACParser.extractAAC(data.audio);
-            if (slices.length > 0) {
-                chunks.audio = this.getAudioFrames(slices, duration);
-                remux = true;
-            }
-        }
-        if (!remux) {
-            debug.error('Input object must have video and/or audio property. Make sure it is a valid typed array');
-            return;
-        }
-        this.remuxController.remux(chunks);
+        this.chunkBuffer.sink(data);
     }

     getVideoFrames(nalus, duration) {

Is this something you want to continue with in a pull request? Or should I leave it like this?

samirkumardas commented 3 years ago

@reyapo Thank you. I have to check it with other cases. I will let you know if I need more information from you.