JohnWeisz opened 6 years ago
The current version does not support offline rendering, but it would be a cool feature, and actually quite straightforward to implement: the render function requests a new buffer from the worker, and the response appears in `onRender` (as `this.outputBus[0][0][curbuf]`). Of course, the `render` method needs to be promoted to public, and `audioContext.currentTime` needs to be replaced with a variable that is updated on each render call.
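As a rough illustration of that node-side change (the `render`/`onRender` names follow the post; everything else about the polyfill's internals is guessed):

```js
// Hedged sketch: render() becomes public, and a renderTime field
// replaces audioContext.currentTime. The class name and the field name
// are hypothetical; only render/onRender come from the post.
class PolyfillNode {
  constructor() {
    this.renderTime = 0; // stands in for audioContext.currentTime
  }

  // Public entry point: request the next quantum at the given time.
  // A real implementation would postMessage() to the worker here and
  // surface the reply through onRender.
  render(time) {
    this.renderTime = time;
  }
}

const node = new PolyfillNode();
node.render(0.25);
console.log(node.renderTime); // 0.25
```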
Offline rendering could then be achieved with main-thread code like:
```js
function renderOffline(endtime, callback) {
    var curtime = 0;
    var buffers = [];
    // bufduration = length of one render quantum, in seconds

    // polyfill should invoke this from onRender
    node.onBufferReceived = (buf) => {
        buffers.push(buf.slice()); // maybe slice() is not required
        curtime += bufduration;
        if (curtime < endtime) {
            postMessagesToWorklet(); // MIDI, params, etc.
            node.render(curtime);
        } else {
            callback(buffers);
        }
    };
    node.render(curtime);
}
```
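For anyone who wants to see the loop actually terminate, here is a self-contained version of the sketch above with a mock node standing in for the polyfill (`MockNode`, `BUF_FRAMES`, and the round `bufduration` value are invented for the demo):

```js
const BUF_FRAMES = 128;
const bufduration = 0.125; // one quantum, in seconds (round value for the demo)

class MockNode {
  // render(time) would ask the worker for one buffer; here the "reply"
  // is delivered immediately through onBufferReceived for simplicity
  // (the real polyfill would reply asynchronously via postMessage).
  render(time) {
    this.onBufferReceived(new Float32Array(BUF_FRAMES));
  }
}

let renderedBuffers = null;

function renderOffline(node, endtime, callback) {
  let curtime = 0;
  const buffers = [];
  node.onBufferReceived = (buf) => {
    buffers.push(buf.slice());
    curtime += bufduration;
    if (curtime < endtime) {
      node.render(curtime); // request the next quantum
    } else {
      callback(buffers); // done: hand back everything rendered
    }
  };
  node.render(curtime); // kick off the first quantum
}

renderOffline(new MockNode(), 1.0, (buffers) => {
  renderedBuffers = buffers;
});
console.log(renderedBuffers.length); // 1.0 s / 0.125 s = 8 buffers
```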
I've been reading up on this project's description (as well as the code itself), and first I'd like to say I appreciate the effort behind it.
However, I saw that it uses dedicated workers for audio processing and passes audio data back and forth, which is asynchronous.
So my question is: how does this work with `OfflineAudioContext`, in which the `onaudioprocess` handler can be called faster than real time progresses? Does it work at all?
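To make the concern concrete, here is a toy simulation (all names invented, no Web Audio involved) of why a synchronous offline pull and an asynchronous worker round-trip don't line up:

```js
// An offline context pulls quanta as fast as it can, in one synchronous
// burst, while a worker-based processor only delivers data via the
// event loop, i.e. after the burst has already finished.
const QUANTA = 4;

let latestBuffer = new Float32Array(128); // starts out as silence
function requestBufferFromWorker() {
  setTimeout(() => {
    latestBuffer = new Float32Array(128).fill(1); // real audio arrives too late
  }, 0);
}

// Offline side: pull all quanta before any worker reply can land.
const pulled = [];
for (let i = 0; i < QUANTA; i++) {
  requestBufferFromWorker();
  pulled.push(latestBuffer); // still the silent buffer every time
}

const gotOnlySilence = pulled.every((b) => b[0] === 0);
console.log(gotOnlySilence); // true: every quantum was read before a reply
```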