Vanilagy / mp4-muxer

MP4 multiplexer in pure TypeScript with support for WebCodecs API, video & audio.
https://vanilagy.github.io/mp4-muxer/demo
MIT License

Chunking not working properly? #51

Closed — darklightblue closed this 4 months ago

darklightblue commented 4 months ago

So I have this code.

const muxer = new Mp4Muxer({
  target: new StreamTarget({
    onData: async (data: Uint8Array, position: number) => {
      console.log(data, position);

      // this function performs a write to file via stream (fs.createWriteStream method)
      window.desktop.writeChunkStream(data);
    },

    // looks like this one is bugged? it doesn't send all the correct data
    chunked: true,
    chunkSize: 8 * 2 ** 20,
  }),
  fastStart: "fragmented",
  firstTimestampBehavior: 'offset',
});

I have a test video that is 2 minutes and 48 seconds long, around 110 MB.

If I remove the chunked and chunkSize parameters, everything works correctly: the video is written without any loss, coming out at 2:48 and 110 MB.

If I add those parameters back, it still builds the video, but an incomplete one: it only plays for about 12 seconds, yet it is the same 110 MB in size.

From the logs, I can see the data arriving with a different position each time (so I don't think it is sending identical data), but I noticed that I am getting two chunks whose sizes differ from the chunkSize I explicitly set:

[screenshot: console output of the onData calls showing two chunks with different sizes]

Vanilagy commented 4 months ago

Hi!

Why aren't you passing position here?

window.desktop.writeChunkStream(data);

The position is just as important as the data itself and has to be respected. It's not an append-only buffer.
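
For a chunked StreamTarget, that means forwarding the offset and writing the bytes at exactly that offset on the other end. A minimal sketch, assuming your writeChunkStream is changed to accept the position:

onData: (data: Uint8Array, position: number) => {
  // Forward the byte offset as well; whatever writes the file must place
  // `data` at exactly `position` (e.g. fs.write with an explicit position),
  // not just append it to the end.
  window.desktop.writeChunkStream(data, position);
},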

darklightblue commented 4 months ago

I see, thanks for the idea.

At first I did not use the position because I was using fs.createWriteStream, which does not accept a position argument:

const writableStream = createWriteStream(filePath);
writableStream.write(chunk);
...

After your comment I realized I needed a different approach. For other people who run into the same question, this is what I had to do:

import fs from 'fs';

// `e` and `id` below come from the surrounding IPC handler that these
// functions live in (not shown here).
let writeStream: { fd: number; filePath: string; writePromise: Promise<void> | null };

function openFile(filePath: string) {
    fs.open(filePath, 'w', (err: NodeJS.ErrnoException | null, fd: number) => {
      if (err) {
        console.error('Error opening file:', err);
        e.reply(`write-stream-error-${id}`, err.message);
        return;
      }
      writeStream = { fd, filePath, writePromise: null };

      e.reply(`write-stream-started-${id}`);
    });
}

function writeChunk(chunk: Uint8Array, position: number) {
    const { fd } = writeStream;

    // Chain onto the previous write so `end` can await every pending write,
    // and write the chunk at the exact byte offset the muxer provided.
    const previous = writeStream.writePromise ?? Promise.resolve();
    writeStream.writePromise = previous.then(
      () =>
        new Promise<void>((resolve, reject) => {
          fs.write(fd, chunk, 0, chunk.length, position, err => {
            if (err) {
              reject(err);
              return;
            }
            resolve();
          });
        })
    );
}

async function end() {
    const { fd, writePromise } = writeStream;

    try {
      if (writePromise) {
        await writePromise;
      }
      fs.close(fd, (err: NodeJS.ErrnoException | null) => {
        if (err) {
          return;
        }
        e.reply(`write-stream-ended-${id}`);
      });
    } catch (err: any) {
      console.error('Error ending write stream:', err);
    }
}

Call openFile first, then set up the mp4 muxer and call writeChunk whenever you receive data, then call end once everything is done.
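
Roughly, the renderer side then looks like this (a sketch only; openFileStream and endStream are hypothetical names for the preload wrappers that call openFile and end over IPC, and the video/audio track config is omitted):

import { Muxer as Mp4Muxer, StreamTarget } from 'mp4-muxer';

await window.desktop.openFileStream(filePath);

const muxer = new Mp4Muxer({
  target: new StreamTarget({
    // Pass both the bytes and the byte offset through to the main process
    onData: (data, position) => window.desktop.writeChunkStream(data, position),
    chunked: true,
    chunkSize: 8 * 2 ** 20,
  }),
  fastStart: 'fragmented',
  firstTimestampBehavior: 'offset',
  // ...video/audio track configuration as usual
});

// ...feed encoded chunks from the WebCodecs encoders, then:
muxer.finalize();
await window.desktop.endStream();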

Vanilagy commented 4 months ago

Awesome, and thanks for the code. Is the file correct now, even using chunked mode?

darklightblue commented 4 months ago

> Awesome, and thanks for the code. Is the file correct now, even using chunked mode?

Yes, the file is correct now. I guess my remaining question is why I am getting two different chunk sizes in the data despite the chunkSize that I set?

Vanilagy commented 4 months ago

The chunk with the small size looks like it's out of order: if you add 608 to 66550696, you don't get the position of the next chunk. So it probably corrected some bytes that had already been written earlier, in which case it doesn't make sense to emit a full chunk just to write 608 bytes.
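
If you want to see that in your own logs, a quick illustrative check is to track where the next contiguous byte would land and flag anything that jumps elsewhere:

// Illustrative only: log writes that don't continue where the previous one ended
let nextExpectedPosition = 0;

const onData = (data: Uint8Array, position: number) => {
  if (position !== nextExpectedPosition) {
    console.log(`non-contiguous write at ${position}, expected ${nextExpectedPosition}`);
  }
  nextExpectedPosition = Math.max(nextExpectedPosition, position + data.byteLength);

  window.desktop.writeChunkStream(data, position);
};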