Vanilagy / webm-muxer

WebM multiplexer in pure TypeScript with support for WebCodecs API, video & audio.
https://vanilagy.github.io/webm-muxer/demo
MIT License

Is it possible to get file buffer before .finalize() is called? #2

Closed: pie6k closed this issue 1 year ago

pie6k commented 1 year ago

First of all - thank you for creating this amazing lib! I'm going to use it in the https://screen.studio rendering & encoding pipeline.

In my pipeline, I need to transcode the .webm file into .mp4 (I hoped VP9 could be placed directly into an .mp4 container without transcoding, but such a file will not play in QuickTime on macOS).

What I can do is wait for the .webm file to be ready and then start transcoding. This will work, but since export speed is critical for me, I'd like to start transcoding before the .webm file is finished (i.e., before all video chunks have been added).

Thus my question is: is it possible to get the file data buffer while I'm still adding video chunks, so I can already pass it to ffmpeg? This would allow me to parallelize encoding the .webm and transcoding it to .mp4.

Thank you!

Vanilagy commented 1 year ago

As it stands, this is not possible, because the buffer contents are not written in a straightforward, linear fashion. I often jump back to set bytes I allocated previously, and at the end, when .finalize is called, I jump back to the very start of the file to set some sizes and pointers to information I can only know at the end. Here:

[screenshot of the file layout and the retroactively patched elements omitted]

SeekHead and Cues in particular are sections of the file I can't know beforehand, as is the size of the entire Segment.

That said, WebM is also designed to work as a streaming format; the muxer just needs to write things slightly differently, knowing that it won't be able to go back and change past bytes, so it omits elements such as SeekHead and Cues. Maybe I'll get around to adding that to this library! It shouldn't be too hard.

If I recall correctly, there is a library that does the muxing in a streaming way, but it's harder to use and requires some WASM, which isn't optimal. Having to encode twice is also not optimal in general. Ideally, we'd go straight from the video frames to the .mp4, but there doesn't seem to be a good muxing library for that yet.

If it's not too hard, maybe I'll create a sister library to this one that spits out .mp4 instead. But no promises!

pie6k commented 1 year ago

Thank you for the detailed write-up!

I was worried it would be exactly as you said. I understand, and it makes total sense; I had no knowledge of how linear (or not) writing to WebM is.

Perhaps I'll create a few shorter .webm videos and then merge them with ffmpeg. That way I could still get some parallelism.
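
For illustration, one common way to merge such segments without re-encoding is ffmpeg's concat demuxer, assuming all segments share the same codec parameters. A rough sketch driving it from Node, with placeholder file names:

    // Sketch: merge pre-encoded segments with ffmpeg's concat demuxer (no re-encode).
    // Segment names are placeholders; all segments must share codec parameters.
    import { writeFileSync } from 'node:fs';
    import { execFileSync } from 'node:child_process';

    const segments = ['part-0.webm', 'part-1.webm', 'part-2.webm'];
    writeFileSync('list.txt', segments.map(f => `file '${f}'`).join('\n'));

    // -c copy remuxes the streams without touching the encoded data.
    execFileSync('ffmpeg', ['-f', 'concat', '-safe', '0', '-i', 'list.txt', '-c', 'copy', 'merged.webm']);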

I was also hoping I could send the encoded video chunks directly to ffmpeg without wrapping them in WebM first, but I didn't really find a way to do this.

Also, do you have any clues as to how I could avoid transcoding to the H.264 codec while still producing an .mp4 that works across all platforms?

pie6k commented 1 year ago

Also, if you feel you could create an mp4 muxer with similar performance that works with VideoEncoder directly, without the need for transcoding, I'd be happy to pay you for that in the form of a freelance project.

The critical part is never having to read pixel data back from the GPU, but I imagine you're very aware of that.

Vanilagy commented 1 year ago

So, ffmpeg actually supports raw video codec streams as input (for example, a raw H.264 stream), or even uncompressed video in RGB or YUV format. I haven't used these features myself, but I'm sure you could find something. You could then use a VideoEncoder, get H.264 out, and pipe that directly into ffmpeg. This might be fine as a solution for you.
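
For illustration, here is a minimal sketch of the encoder side of that idea, assuming Annex B H.264 output from WebCodecs. The codec string and dimensions are placeholders, and forwardToFfmpeg is a hypothetical function standing in for whatever mechanism hands the bytes to an ffmpeg process (a sketch of that side follows further below):

    // Sketch: encode frames with WebCodecs and hand the raw H.264 bytes onward.
    // forwardToFfmpeg is hypothetical, e.g. IPC to an Electron main process.
    declare function forwardToFfmpeg(bytes: Uint8Array): void;

    const encoder = new VideoEncoder({
        output: (chunk: EncodedVideoChunk) => {
            const bytes = new Uint8Array(chunk.byteLength);
            chunk.copyTo(bytes); // copy the encoded H.264 data out of the chunk
            forwardToFfmpeg(bytes);
        },
        error: (e) => console.error(e)
    });

    encoder.configure({
        codec: 'avc1.42001f',     // Baseline profile; other strings may work better
        width: 1280,              // placeholder dimensions
        height: 720,
        bitrate: 5_000_000,
        framerate: 60,
        avc: { format: 'annexb' } // raw Annex B stream, which ffmpeg's h264 demuxer expects
    });

    // Later: encoder.encode(frame); ... ; await encoder.flush();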

However, ffmpeg is quite heavy and large and I'd personally hesitate to include it with any website where size is a concern. I'm pretty sure I could write an mp4 muxer that works like this one and gets input directly from a VideoEncoder while being only a couple of kB large. I'd be willing to get to work on it if you were to pay me, so in that case, we should get in contact.

pie6k commented 1 year ago

Oh, is it? So I can directly pipe H.264 video chunks as the -i input? Do I need some special input parameters to get it to work?

E.g. from a config like the one below? I.e., no muxing is needed first?

{
    codec: "avc1.42001E",
    width: 720,
    height: 480,
    bitrate: 1000000,
    avc: { format: "annexb" },
    framerate: 60,
    hardwareAcceleration: "prefer-software"
}

Vanilagy commented 1 year ago

I'm not sure if it needs extra parameters! I've read that you need to specify the format as -f h264; besides that, I don't know. You definitely don't need to mux it if you only have video!

Also, I would not use prefer-software.
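
For reference, here is a rough sketch of what the receiving side could look like in an Electron/Node main process. The flags are a starting point only; in particular, a raw H.264 stream carries no timing information, so you may need to tell ffmpeg the frame rate yourself:

    // Sketch (Node/Electron main process): pipe raw Annex B H.264 into ffmpeg
    // and remux it into an .mp4 without re-encoding.
    import { spawn } from 'node:child_process';

    const ffmpeg = spawn('ffmpeg', [
        '-f', 'h264',       // the input is a raw H.264 stream
        '-framerate', '60', // raw streams have no timestamps, so supply the rate
        '-i', 'pipe:0',     // read the stream from stdin
        '-c', 'copy',       // no transcoding, just remux into the mp4 container
        'output.mp4'
    ]);

    // Call this for every encoded chunk produced by the VideoEncoder.
    function forwardToFfmpeg(bytes: Uint8Array): void {
        ffmpeg.stdin.write(bytes);
    }

    // When encoding is done, close stdin so ffmpeg can finish the file:
    // ffmpeg.stdin.end();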

pie6k commented 1 year ago

The point is that I get "not supported" in Chrome if I use prefer-hardware.

I'll check it, and if I need muxing and it has reasonable performance and quality, I will get in touch.

Vanilagy commented 1 year ago

Alright! I managed to get a VideoEncoder to work with H.264 today. I just changed my demo slightly (not committed).

Just search the internet for people creating VideoEncoders with an avc codec; there are different codec strings, and some might work better than others.
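
If it helps, WebCodecs lets you probe whether a given configuration is supported before creating the encoder. A small sketch; the codec strings below are just examples to try:

    // Sketch: probe a few H.264 (avc1) codec strings and pick one the browser accepts.
    const candidates = ['avc1.42001f', 'avc1.4d0028', 'avc1.640028']; // example strings only

    async function pickAvcCodec(width: number, height: number): Promise<string | null> {
        for (const codec of candidates) {
            const { supported } = await VideoEncoder.isConfigSupported({
                codec,
                width,
                height,
                avc: { format: 'annexb' }
            });
            if (supported) return codec;
        }
        return null;
    }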

yume-chan commented 1 year ago

I have also thought about this before: even when finalize is not called, I can still get a playable (though not seekable) file, like the MKV files produced by OBS.

However, it's not possible to write files directly to the user's file system on the Web (https://github.com/WICG/file-system-access/issues/260#issuecomment-1374755464), so there is little point in doing that.

The MP4 container can have its file metadata written at either the beginning or the end, so a "linear" writing mode is easier there, except that failing to write the metadata leaves the whole file invalid.

Vanilagy commented 1 year ago

In my personal use I have never needed to be able to read "unfinished" files; I want the whole video or nothing. But I guess there are many more use cases out there! Seekability especially is not something I'd want to sacrifice. Of course, if we're talking about streaming, that's a different matter.

Also, can't you use the OPFS to write files in-place? Should the power go out, you can simply restore to the last point in the OPFS and then flush that to the user's disk.
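
For reference, a minimal sketch of what in-place OPFS writes could look like. The file name is arbitrary and error handling is omitted; sync access handles are only available inside a worker, but they allow offset-based writes that are on disk once flush() returns:

    // Sketch (inside a Web Worker): write into a file in the origin private
    // file system (OPFS) at arbitrary offsets.
    const root = await navigator.storage.getDirectory();
    const fileHandle = await root.getFileHandle('recording.webm', { create: true });
    const access = await fileHandle.createSyncAccessHandle();

    function writeAt(bytes: Uint8Array, offset: number): void {
        access.write(bytes, { at: offset }); // write at an arbitrary position
        access.flush();                      // push the data to disk
    }

    // When finished (or before handing the file to the user): access.close();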

yume-chan commented 1 year ago

My use case is similar to OBS: recording a stream, and if anything bad happens, the previously recorded part is still playable.

With OPFS, I would still need to create some UI and tell the user to go back to my web app to "retrieve" the file. It feels complex, and no native programs work like this.

Vanilagy commented 1 year ago

Yes, that's right. It is a limitation of the web platform. If your use case allows you to use something like Electron, you could explore better solutions.

Vanilagy commented 1 year ago

Honestly though, given that power outages are quite rare, I don't think any user would mind being prompted to restore a file they care about if one occurs.

pie6k commented 1 year ago

It seems my use case is now more of a memory issue:

A few users have reported "out of memory" errors when working with long videos if I use the buffer output.

I checked the file handle API, but it requires me to use a handle obtained from a file picker. I don't want to show the user a picker, as I save to a temp file, but it seems there is no way to avoid it when working with this API (I use Electron).

It would be nice to have a third output mode, something like 'file-manual', that expects you to provide a set of callbacks such as saveBytes(bytes, offset), etc. It would then be up to me to correctly manage saving this content to a file, while also making it possible to save in a non-linear way.
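
Purely as an illustration of what such a target could look like (this is a hypothetical API, not something webm-muxer offers at the time of writing), here is a sketch that pairs the callback with offset-based file writes via Node's fs in Electron:

    // Hypothetical "manual" target: the muxer would call back with bytes and the
    // offset they belong at, and the application decides how to persist them.
    import { open, type FileHandle } from 'node:fs/promises';

    interface ManualTarget {
        saveBytes(bytes: Uint8Array, offset: number): Promise<void>; // hypothetical callback
    }

    // One possible implementation: write directly into a temp file at the given
    // offset, which also supports the muxer's non-linear (seek-back) writes.
    async function createFileTarget(path: string): Promise<ManualTarget> {
        const handle: FileHandle = await open(path, 'w+');
        return {
            async saveBytes(bytes, offset) {
                await handle.write(bytes, 0, bytes.byteLength, offset);
            }
        };
    }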

Vanilagy commented 1 year ago

Yeah, that sounds like a smart addition. I'll add it when I find the time!

Vanilagy commented 1 year ago

Should you still be interested in a high-quality MP4 muxer for your project, I'm still willing to cooperate.

gut4 commented 1 year ago

@Vanilagy Hi. I'm interested in an mp4 muxer too. Maybe this can help: https://github.com/vjeux/mp4-h264-re-encode

Vanilagy commented 1 year ago

Details about how to use the streaming target are documented in the README. If there are any issues, please report back to me!

Vanilagy commented 1 year ago

@pie6k FFMPEG no more: https://github.com/Vanilagy/mp4-muxer