Open tlecoz opened 4 years ago
I'm searching for this example too.
Yeah deffo agree but the developer is 😴😴😴😴😴
Did any of you figure out how to do this?
@terkelg I made a library, webm-muxer, which works perfectly with the WebCodecs API. If you give me some time (or money), I'll also drop mp4-muxer.
Please do it! I'll give you all the time you need!
It takes a while, man! And I need reasons to work on it.
@terkelg @tlecoz @daxiajunjun Here y'all go:
https://github.com/Vanilagy/mp4-muxer <3
If you like it, make sure to share it so we can gain some traction on this one :)
Thank you so much! It looks perfect!
I think it could be useful to add a noisy audio buffer to your demo in order to get an example as complete as possible for beginners :)
Thank you again!
@tlecoz What do you mean by a noisy audio buffer? Oh as in, give a demo where audio isn't extracted from a media stream, but from an AudioBuffer?
Yes, exactly! By "noisy" I mean an AudioBuffer filled with random values, just for the example.
Thanks again for your great work!
Mh. I'm wondering if it's the right place to include that, since that's technically something about WebCodecs and not necessarily my library. But I see how it could be useful for a beginner 🤔 Just can't think of an elegant way to include it without bloating the demo code.
Have you looked into AudioData on MDN? It's not too complicated!!
That's probably why no one has ever added an example that handles audio ^^ I found different solutions to generate an mp4, but there is no example anywhere, with or without WebCodecs, that handles video with audio.
I'm not a beginner; I haven't tried your library yet, but I saw that you handled audio channels, and I'm confident enough that I will succeed in using it. But for a beginner, I'm almost sure it would be hard because of the lack of examples.
ChatGPT can't help the world with that, because WebCodecs was not complete in September 2021.
I don't want to bother you at all. I'm very grateful for your work.
But I'm sure it would be useful :) Maybe you could add an example somewhere in the source, not in the "official demo".
Thank you again! :)
EDIT: when I said "I'm not a beginner", I meant "I'm not a beginner as a programmer"; I'm a total noob at muxing ^^
Haha yes I know what you mean! I personally was super dissatisfied with the state of available muxers, which is why I made my own! And I never understood why so many muxers only supported video, as if the devs don't have ears (Sorry, getting a little ranty here).
What I meant to say is that muxing doesn't directly have to do with WebCodecs... my libraries, in a way, assume a bit of knowledge about WebCodecs and how to use the API. But I see how a beginner in this space might want a full solution, which is why I added the demos.
To give you a quick primer on how to encode an AudioBuffer: Say you have an AudioBuffer from some source (maybe from an OfflineAudioContext, from decodeAudioData, or you made one yourself). What you then want to do is create an instance of AudioData from that:
// Let's assume 2 channels!
let data = new Float32Array(2 * audioBuffer.length);
data.set(audioBuffer.getChannelData(0), 0);
data.set(audioBuffer.getChannelData(1), audioBuffer.length);
let audioData = new AudioData({
format: 'f32-planar',
sampleRate: audioBuffer.sampleRate,
numberOfFrames: audioBuffer.length,
numberOfChannels: 2,
timestamp: 0, // When the audio plays, like timestamp on VideoFrame
data
});
As you can see, I'm creating a Float32Array which I'm then filling with all of the audio samples. The way I'm arranging both channels here is that I first put all of the data for channel 1, then all of the data for channel 2. This is called a "planar" arrangement. This is opposite to "interleaved", which means the samples take turns (one sample from channel 1, one from 2, one from 1, etc...).
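To make the planar-vs-interleaved distinction concrete, here's a tiny self-contained sketch with two made-up 4-sample channels (the sample values are illustrative, not from any real audio):

```javascript
// Two hypothetical channels of 4 samples each.
const ch1 = [0.1, 0.2, 0.3, 0.4];
const ch2 = [0.5, 0.6, 0.7, 0.8];

// Planar: all of channel 1 first, then all of channel 2.
const planar = Float32Array.from([...ch1, ...ch2]);
// planar layout: [ch1[0], ch1[1], ch1[2], ch1[3], ch2[0], ch2[1], ch2[2], ch2[3]]

// Interleaved: the channels take turns, one sample at a time.
const interleaved = new Float32Array(2 * ch1.length);
for (let i = 0; i < ch1.length; i++) {
  interleaved[2 * i] = ch1[i];      // sample from channel 1
  interleaved[2 * i + 1] = ch2[i];  // sample from channel 2
}
// interleaved layout: [ch1[0], ch2[0], ch1[1], ch2[1], ...]
```

The AudioData snippet above uses the planar arrangement, which is why it passes `format: 'f32-planar'`; WebCodecs also accepts `'f32'` for interleaved float data.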
Then, all we need to do is pass audioData to an AudioEncoder!
// Assume we have audioEncoder, setup for that is in the demo
audioEncoder.encode(audioData);
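For completeness, a minimal sketch of how such an audioEncoder could be set up, assuming AAC output at 48 kHz with 2 channels (the bitrate and the handleChunk callback are illustrative choices, not part of any library):

```javascript
// Illustrative encoder config: AAC-LC ('mp4a.40.2') is the usual codec for mp4.
const audioEncoderConfig = {
  codec: 'mp4a.40.2',
  sampleRate: 48000,
  numberOfChannels: 2,
  bitrate: 128_000
};

// Placeholder output callback; in a real app you'd hand each chunk to the muxer.
function handleChunk(chunk, metadata) {
  console.log('encoded chunk of', chunk.byteLength, 'bytes');
}

// AudioEncoder only exists in browsers that ship WebCodecs.
if (typeof AudioEncoder !== 'undefined') {
  const audioEncoder = new AudioEncoder({
    output: handleChunk,
    error: e => console.error(e)
  });
  audioEncoder.configure(audioEncoderConfig);
  // ...then: audioEncoder.encode(audioData); await audioEncoder.flush();
}
```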
This is basically what I do in Marble Blast Web, with a bit of resampling logic on top: Code
Again, keep in mind this here is a general tutorial on the WebCodecs API, and has nothing to do with my muxers directly!
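To make the earlier "noisy buffer" idea concrete, here's a sketch of generating one second of white noise and putting it into an AudioBuffer. AudioContext is browser-only, so the raw samples are built first; the variable names are illustrative:

```javascript
// One second of white noise at 48 kHz.
const sampleRate = 48000;
const length = sampleRate; // 1 second worth of samples
const noise = new Float32Array(length);
for (let i = 0; i < length; i++) {
  noise[i] = Math.random() * 2 - 1; // random values, roughly in [-1, 1]
}

// In a browser, copy the same noise into both channels of an AudioBuffer.
if (typeof AudioContext !== 'undefined') {
  const ctx = new AudioContext();
  const audioBuffer = ctx.createBuffer(2, length, sampleRate);
  audioBuffer.copyToChannel(noise, 0);
  audioBuffer.copyToChannel(noise, 1);
  // This audioBuffer can then be turned into AudioData as shown above.
}
```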
Thank you so much for your help! It sounds very simple! I will try to do something tonight!
Your "Marble Blast" game is impressive! It was one of my favorite games when I was a kid :) Your code is very inspiring; I will look at your repos carefully :)
Thank you again!
Thank you! Marble Blast was the first video game I ever played, and ultimately the first thing that got me using computers!
Thanks to your amazing work, I've been able to make the tool I wanted very quickly! https://github.com/tlecoz/Mp4Maker
Thank you again!!!
Hello,
The WebCodecs API introduces the ability to encode & decode video directly in the browser, but doesn't provide any way to natively mux the data and create an mp4 file.
You can see it here: https://github.com/WICG/web-codecs/blob/master/explainer.md (I'm talking about the "Example of transcoding or offline encode/decode" section).
I know how to do what I want using FFmpeg, but I would like to not use it at all.
A few days ago, my boss directly asked one of the engineers from Google in charge of the WebCodecs API about the correct way to output an mp4 file, and he told us that WebCodecs doesn't support this feature but some external libraries do (and he quoted mux.js).
The problem is I'm not at all an expert at muxing. To be honest, I didn't even know what "mux" meant this morning, and obviously I have no idea how to use your library to do what I want.
Now that the VideoEncoder class exists, the ability to output an mp4 file in the browser seems obvious, and I think a working example that shows us (the world) how to implement it would be widely appreciated.