RavikumarTulugu closed this issue 7 months ago
Are you passing `encode` a correct `AudioData`? The polyfill needs its own `AudioData`; it cannot use native WebCodecs' `AudioData`.
Okay, I am not passing AudioData; I am passing raw audio buffers, same as with native WebCodecs. Let me try and get back. I copied this from the samples; is it the right usage? Can I follow this piece of code?
```js
/* NOTE: This direct-copy (_libavGetData) is here only because built-in
 * WebCodecs can't use our AudioData. Do not use it in production code. */
for (const frame of frames) {
    encoder.encode(new AudioData({
        format: frame.format,
        sampleRate: frame.sampleRate,
        numberOfFrames: frame.numberOfFrames,
        numberOfChannels: frame.numberOfChannels,
        timestamp: frame.timestamp,
        data: frame._libavGetData()
    }));
}
```
According to the spec, native WebCodecs doesn't let you pass raw audio buffers; you're supposed to pass `AudioData`. If Chrome's `encode` accepts raw buffers, that's not spec compliant (and news to me).

Everything you've written there looks like you already have an AudioData, not raw buffers. But it's probably not the right kind of AudioData. You cannot mix and match browser `AudioData` objects with polyfill `AudioData` objects. If you're using the polyfill's encoder, you need to use the polyfill's `AudioData`.
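For illustration, constructing the polyfill's own `AudioData` might look like the sketch below. The `polyfill` parameter stands for the loaded `LibAVWebCodecsPolyfill` namespace, and the helper name and sample values (10 ms of mono silence at 48 kHz) are mine, not part of the polyfill:

```js
// Sketch: construct the polyfill's own AudioData rather than the browser's.
// `polyfill` stands for the loaded LibAVWebCodecsPolyfill namespace; the
// helper name and sample values are illustrative only.
function makePolyfillFrame(polyfill) {
    const samples = new Float32Array(480); // 10 ms of mono silence at 48 kHz
    return new polyfill.AudioData({
        format: "f32-planar",
        sampleRate: 48000,
        numberOfFrames: 480,
        numberOfChannels: 1,
        timestamp: 0, // microseconds
        data: samples
    });
}
// A polyfill encoder can then accept it:
//   encoder.encode(makePolyfillFrame(LibAVWebCodecsPolyfill));
```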
Sorry, by raw I meant the audio data format supported by WebCodecs.

Is there a utility wrapper or routine to convert between the WebCodecs `AudioData` format and the polyfill's `AudioData` format? It would be helpful. I am looking through the samples.
There isn't, but you're right that it would be helpful. Out of curiosity, where is your `AudioData` coming from? If you have `AudioData` because you're already decoding using WebCodecs, then yeah, that's the right conversion to make, but if you're constructing the `AudioData` yourself, then you should just be constructing the right one. Incurring an extra copy is best avoided.
I have a Web Audio pipeline which is being fed by a MediaStreamTrackProcessor. I am not building the AudioData myself.
Ohhhhh, of course, MSTP. Yeah, that makes sense. Well, unfortunately, for the time being, the answer is that you'll have to convert yourself: copy the data out with `allocationSize` and `copyTo`, then build a polyfill `AudioData` yourself. You could use https://github.com/Yahweasel/libavjs-webcodecs-bridge to kinda-sorta do that for you (copy to a libav.js Frame and back), though that's a bit silly since you don't actually need it in libav.js format either.
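The manual copy-out-and-rebuild conversion described above might look something like this sketch. The helper name and its parameterization are mine; `PolyfillAudioData` stands for the polyfill's `AudioData` constructor, and `allocationSize`/`copyTo` are the standard WebCodecs `AudioData` methods:

```js
// Sketch: read the raw samples out of a native WebCodecs AudioData with
// allocationSize()/copyTo(), then construct a polyfill AudioData from them.
// The helper name and parameterization are illustrative, not polyfill API.
function nativeToPolyfill(nativeData, PolyfillAudioData) {
    // Planar formats have one plane per channel; interleaved formats have one.
    const planar = nativeData.format.endsWith("-planar");
    const planes = planar ? nativeData.numberOfChannels : 1;

    // Work out the total size and copy each plane into one contiguous buffer
    // (the AudioData constructor expects planes laid out back to back).
    const sizes = [];
    let total = 0;
    for (let i = 0; i < planes; i++) {
        sizes.push(nativeData.allocationSize({ planeIndex: i }));
        total += sizes[i];
    }
    const buf = new Uint8Array(total);
    let offset = 0;
    for (let i = 0; i < planes; i++) {
        nativeData.copyTo(buf.subarray(offset, offset + sizes[i]),
                          { planeIndex: i });
        offset += sizes[i];
    }

    return new PolyfillAudioData({
        format: nativeData.format,
        sampleRate: nativeData.sampleRate,
        numberOfFrames: nativeData.numberOfFrames,
        numberOfChannels: nativeData.numberOfChannels,
        timestamp: nativeData.timestamp,
        data: buf
    });
}
```

Note this incurs exactly the extra copy discussed above; there is no way around it when the data originates as a native `AudioData`.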
The design intent of the polyfill was to, well, polyfill, i.e., to implement this API on browsers that don't have it at all. Little attention was paid to interop with the browser's own implementation of the same API, because that just wasn't the case I particularly anticipated. I'm going to change the title of this issue to reflect this need and consider it an open bug in that vein.
Ok, great!! Let me refer to the bridge and see if I can get my hands on anything. It would be great if the polyfill encoder and decoder could directly consume WebCodecs AudioData and VideoFrames.
I'm not particularly inclined to let the polyfill take native WebCodecs AudioDatas and VideoFrames, because those interfaces only give access to the raw data with an extra copy, so it would hide a lot of extra cost to just silently accept them. But yeah, in situations like yours, where that performance penalty is unavoidable, at least an adapter should be available. Perhaps the compromise is an extra initialization option when making an en/decoder that puts it in an "accept native data" mode, so the user at least has to ask to pay the performance penalty instead of having it silently do so.
546a7bc adds these functions, but the conversion is not done automatically by `encode`; you have to explicitly convert from native if you're using the polyfill, with `LibAVWebCodecsPolyfill.AudioData.fromNative(whatever)`. Should be in a release soonish.
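Wiring that into an MSTP pipeline might look roughly like the sketch below. Only `fromNative` comes from the commit mentioned above; the reader loop, function name, and parameters are illustrative:

```js
// Sketch: explicitly convert each native frame before handing it to the
// polyfill encoder. Only fromNative is polyfill API (added in 546a7bc);
// the loop structure and names here are illustrative.
async function pumpFrames(reader, fromNative, encoder) {
    for (;;) {
        const { done, value: nativeFrame } = await reader.read();
        if (done) break;
        const frame = fromNative(nativeFrame); // polyfill AudioData
        encoder.encode(frame);
        nativeFrame.close(); // release the native copy once converted
    }
}
// e.g. pumpFrames(trackProcessor.readable.getReader(),
//                 nf => LibAVWebCodecsPolyfill.AudioData.fromNative(nf),
//                 encoder);
```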
This error is seen right on the first encode call. Can this be ignored??

What does this comment mean? I am sure my buffer is not detached, as I am using the encoder synchronously.