Closed: juj closed this issue 1 year ago
Direct exposure of MessageChannel
would be useful. fetch()
and WebTransport
would also be useful. I have no evidence that either of the above would impact AudioWorkletProcessor
performance. See also
An alternative that does not meet the specific requirement, as it involves further dynamic creation of single-use module scripts (Worker; ServiceWorker), is to utilize the fact that import is possible in AudioWorkletGlobalScope:
new Worker('./worker.js', {
type: 'module'
}); // the last time I checked, Firefox had not yet implemented module-type Workers
// worker.js
export const data = new ArrayBuffer(1);
// AudioWorkletGlobalScope
import { data } from './worker.js';
console.log(globalThis, data);
I filed the above two issues because we have TransferableStreams, which we can process in AudioWorkletProcessor scope; however, I have found no way through my own experimentation, other than MessagePort, to get dynamic data, e.g., streams of PCM, to the AudioWorkletGlobalScope itself.
AudioWG call:
An AudioWorkletProcessor that is only used for message passing is not uncommon in large web apps. Expose a MessagePort on the scope. Adding onmessage and postMessage() will definitely be useful.
I see no difference between exposing the event handler and method in AudioWorkletGlobalScope and AudioWorkletProcessor.
For my use cases the question is: how do we get streaming audio data to AudioWorkletGlobalScope and AudioWorkletProcessor?
While we can post messages via MessagePort, we should also just be able to fetch() in either scope. Technically, import fetches, as evidenced by the onfetch event handler being dispatched for import. I see no reason not to expose fetch() in AudioWorkletGlobalScope so that users can stream data directly, instead of routing through a Worker or the main thread via MessagePort. Fewer steps, and less cumbersome than workarounds.
This proposal https://github.com/tc39/proposal-import-assertions#worker-instantiation could extend to AudioWorkletGlobalScope, which is already an ECMAScript module, so we could do something like
import wasm from './memory.wasm' assert {type: 'webassembly'};
I have not yet found a way to use structuredClone() to transfer WebAssembly.Memory using the following approach, though we can export functions.
I am still testing; so far I have only tested creating a Blob URL using strings. I have not yet tried caching a Request for WebAssembly.Memory, etc., and serving that in a Response, which should be possible for the use case in the OP.
What we can do is import JSON and data that can be serialized, if you can get the data you need in JSON format.
worker.js - Intercept the import from AudioWorkletGlobalScope in the ServiceWorker using onfetch; the ModuleSpecifier (exports.js) file does not have to exist.
self.addEventListener('install', (event) => {
event.waitUntil(self.skipWaiting());
});
self.addEventListener('activate', async (event) => {
event.waitUntil(self.clients.claim());
});
self.addEventListener('message', async (event) => {
console.log(event);
});
function mod(data = 123) {
console.log(data);
}
self.addEventListener('fetch', async (event) => {
if (
event.request.destination === 'audioworklet' &&
event.request.url.includes('exports.js')
) {
event.respondWith((async () => {
const response = await fetch('./test.wav');
const buffer = await response.arrayBuffer();
const json = JSON.stringify([...new Int8Array(buffer)]);
const data = `const wav = ${json}; const mod = ${mod.toString()}; export {wav, mod};`;
const blob = new Blob([data], {
type: 'text/javascript',
});
return new Response(blob);
})());
}
});
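The generated module body above relies on spreading the WAV bytes into a JSON array; the round trip back to a typed array is byte-exact, which a few lines outside the worker can confirm (a standalone sketch, not part of the thread's code):

```javascript
// Bytes -> JSON array literal (as embedded in the generated module)
// -> parsed array -> typed array again; signed byte values survive intact.
const original = new Int8Array([0, 1, -1, 127, -128]);
const json = JSON.stringify([...original]); // "[0,1,-1,127,-128]"
const restored = new Int8Array(JSON.parse(json));
console.log(json, restored[4]); // [0,1,-1,127,-128] -128
```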
test.json
{
"mod":123
}
audioWorklet.js
import { wav, mod } from './exports.js';
import json from './test.json' assert { type: "json" };
console.log(json);
// throws right now on Chromium 101
// import wasm from './memory.wasm' assert {type: 'webassembly'};
// console.log(wasm);
mod(); // imported function
const data = new Int8Array(wav).subarray(44); // imported 1 channel, 22050 sample rate, 16 bits per sample WAV file
let offset = 0;
console.log(globalThis, data);
class AudioWorkletStream extends AudioWorkletProcessor {
constructor(options) {
super();
console.log(this, options);
}
process(inputs, outputs) {
if (offset >= data.length) {
this.port.postMessage('');
return false;
}
const [channel] = outputs.flat();
const int8 = new Int8Array(channel.length * 2);
for (let i = 0; i < int8.length; i++, offset++) {
if (offset >= data.length) {
break;
}
int8[i] = data[offset];
}
const int16 = new Int16Array(int8.buffer);
for (let i = 0, j = 0; i < int16.length; i++) {
const int = int16[i];
// Int16Array elements are already signed; scale negative samples by 0x8000
// and non-negative samples by 0x7fff to map into [-1, 1].
const float = int < 0 ? int / 0x8000 : int / 0x7fff;
channel[j++] = float;
}
return true;
}
}
registerProcessor('audio-worklet-stream', AudioWorkletStream);
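Since Int16Array elements are already signed, the PCM-to-float mapping used in process() can be isolated as a small pure function and checked outside the worklet (a sketch, not part of the thread's code):

```javascript
// Map signed 16-bit PCM samples into [-1, 1]: negative samples scale
// by 0x8000 (32768), non-negative samples by 0x7fff (32767).
function int16ToFloat32(int16) {
  const out = new Float32Array(int16.length);
  for (let i = 0; i < int16.length; i++) {
    const s = int16[i]; // already in [-32768, 32767]
    out[i] = s < 0 ? s / 0x8000 : s / 0x7fff;
  }
  return out;
}

console.log(int16ToFloat32(new Int16Array([0, 32767, -32768, -16384])));
// Float32Array [0, 1, -1, -0.5]
```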
script.js - A unique script URL ('./worker.js?' + new Date().getTime()) for the ServiceWorker is important, particularly for single-use import and fetch() interception where the ServiceWorker is unregistered after completing the required task and is not expected to do anything else. The ServiceWorker does not unregister on localhost without a query component in the scriptURL.
try {
if (globalThis.gc) {
gc();
}
document.querySelector('button').onclick = async () => {
for (const reg of await navigator.serviceWorker.getRegistrations()) {
await reg.unregister();
}
const sw = await navigator.serviceWorker.register(
'./worker.js?' + new Date().getTime(),
{
scope: './',
}
);
const ac = new AudioContext({
latencyHint: 0,
sampleRate: 22050,
});
await ac.audioWorklet.addModule('./audioWorklet.js');
const aw = new AudioWorkletNode(ac, 'audio-worklet-stream', {
numberOfInputs: 0,
numberOfOutputs: 1,
outputChannelCount: [1],
processorOptions: {},
});
aw.onprocessorerror = (e) => {
console.error(e);
console.trace();
};
aw.port.onmessage = async (e) => {
console.log(e.data);
aw.disconnect();
await ac.close();
await sw.unregister();
};
aw.connect(ac.destination);
if (ac.state === 'suspended') {
await ac.resume();
}
};
} catch (e) {
console.error(e);
throw e;
}
index.html
<!DOCTYPE html>
<html>
<head>
<title>
</title>
</head>
<body>
<button>Play</button>
<script type="module" src="./script.js">
</script>
</body>
</html>
Thanks for filing this issue.
Another option is to import the wasm file. I don't know if that is feasible for your use cases.
Much simpler is defining onmessage and postMessage() in AudioWorkletGlobalScope, with consideration and provision for await clients.matchAll({includeUncontrolled: true}) in ServiceWorker, and some means to set navigator.serviceWorker.onmessage = async (e) => {} in AudioWorkletGlobalScope.
We can import wasm as base64, or through a ServiceWorker.
wasm.js
export const wasm = 'AGFzbQEAAAABBwFgAn9/AX8CCwECanMDbWVtAgABAwIBAAcOAQphY2N1bXVsYXRlAAAKMgEwAQJ/IAAgAUEEbGohAgJAA0AgACACRg0BIAMgACgCAGohAyAAQQRqIQAMAAsLIAML';
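The base64-encode-decode.js helpers imported below are not shown in the thread; a minimal assumed implementation on top of atob/btoa (available on the main thread, in workers, and in recent Node.js) could look like:

```javascript
// Assumed sketch of './base64-encode-decode.js'; in that module these
// two functions would be exported.
function base64ToBytesArr(b64) {
  const bin = atob(b64); // base64 -> binary string (one char per byte)
  const bytes = new Uint8Array(bin.length);
  for (let i = 0; i < bin.length; i++) bytes[i] = bin.charCodeAt(i);
  return bytes;
}

function bytesArrToBase64(bytes) {
  let bin = '';
  for (const b of bytes) bin += String.fromCharCode(b);
  return btoa(bin); // binary string -> base64
}

// The round trip is byte-exact.
console.log(base64ToBytesArr(bytesArrToBase64(new Uint8Array([0, 128, 255]))));
```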
audioWorklet.js
import { base64ToBytesArr, bytesArrToBase64 } from './base64-encode-decode.js';
import { wasm } from './wasm.js';
console.log(wasm);
const asm = base64ToBytesArr(wasm);
const memory = new WebAssembly.Memory({
initial: 10,
maximum: 100,
shared: false,
});
try {
const buffer = new Uint8Array(asm).buffer;
console.log(buffer);
const obj = await WebAssembly.instantiate(buffer, {
js: { mem: memory },
});
console.log(obj);
const i32 = new Uint32Array(memory.buffer);
for (let i = 0; i < 10; i++) {
i32[i] = i;
}
const sum = obj.instance.exports.accumulate(0, 10);
console.log(sum);
} catch (e) {
console.error(e);
}
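The bytes-to-instance flow above does not depend on the particular module; for illustration, here is the same pattern with a tiny hand-assembled module (the standard binary encoding of an exported add(i32, i32) function), using the synchronous APIs, which are fine for modules this small:

```javascript
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0; local.get 1; i32.add; end
]);
const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(exports.add(2, 3)); // 5
```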
Alternatively, intercepting an import request in the ServiceWorker (tested on Chromium 101); here WebAssembly.Memory is not shared. The code is compiled and instantiated in the AudioWorkletGlobalScope without necessarily defining an AudioWorkletProcessor class or executing registerProcessor() until required.
worker.js
import {base64ToBytesArr, bytesArrToBase64} from './base64-encode-decode.js';
self.addEventListener('install', (event) => {
event.waitUntil(self.skipWaiting());
});
self.addEventListener('activate', async (event) => {
event.waitUntil(self.clients.claim());
});
self.addEventListener('message', async (event) => {
console.log(event);
});
self.addEventListener('fetch', async (event) => {
if (
event.request.destination === 'audioworklet' &&
event.request.url.includes('exports.json')
) {
event.respondWith(
(async () => {
const response = await fetch('./test.wav');
const buffer = await response.arrayBuffer();
const json = JSON.stringify([...new Int8Array(buffer)]);
const wasm = bytesArrToBase64(new Uint8Array(await (await fetch('./memory.wasm')).arrayBuffer()));
console.log(wasm);
const data = `{"wav":${json}, "wasm": "${wasm}"}`;
console.log(data);
const blob = new Blob([data], {
type: 'application/json',
});
return new Response(blob);
})()
);
console.log(await clients.matchAll({includeUncontrolled: true}));
}
});
audioWorklet.js
import { base64ToBytesArr, bytesArrToBase64 } from './base64-encode-decode.js';
import json from './exports.json' assert {type: "json"};
console.log(json);
const { wav, wasm } = json;
const asm = base64ToBytesArr(wasm);
console.log(asm);
const memory = new WebAssembly.Memory({
initial: 10,
maximum: 100,
shared: false,
});
try {
const buffer = new Uint8Array(asm).buffer;
const obj = await WebAssembly.instantiate(buffer, {
js: { mem: memory },
});
console.log(obj);
const i32 = new Uint32Array(memory.buffer);
for (let i = 0; i < 10; i++) {
i32[i] = i;
}
const sum = obj.instance.exports.accumulate(0, 10);
console.log(sum);
} catch (e) {
console.error(e);
}
Potential solutions include exposing BroadcastChannel in AudioWorkletGlobalScope, where the main thread, DedicatedWorkerGlobalScope, SharedWorkerGlobalScope, and ServiceWorkerGlobalScope would all be capable of communicating with AudioWorkletGlobalScope (note that BroadcastChannel structured-clones messages and does not accept transferables).
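For a sense of how that would look, the sketch below models both ends with two BroadcastChannel instances in a single scope; since BroadcastChannel is not actually exposed in AudioWorkletGlobalScope today, the "worklet" end here is hypothetical:

```javascript
const producer = new BroadcastChannel('pcm-feed'); // stands in for the main thread
const consumer = new BroadcastChannel('pcm-feed'); // stands in for the worklet scope

consumer.onmessage = (e) => {
  // BroadcastChannel structured-clones its payload; nothing is transferred.
  console.log('received', e.data.byteLength, 'bytes');
  producer.close();
  consumer.close();
};

producer.postMessage(new ArrayBuffer(128));
```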
It turns out calling
await WebAssembly.instantiate()
before registerProcessor() results in the AudioWorkletNode constructor throwing
Failed to construct 'AudioWorkletNode': AudioWorkletNode cannot be created: The node name 'audio-worklet-stream' is not defined in AudioWorkletGlobalScope.
This also occurs when await WebAssembly.instantiate() is used in another file imported into the audio worklet global scope.
One workaround for that is to use Promise.all() or Promise.race().
const [a, b] = await Promise.all([
(async() => {
const obj = await WebAssembly.instantiate(buffer, {
js: { mem: memory },
});
// ..
})(),
// registerProcessor() is called synchronously here, while instantiation is still pending
registerProcessor('audio-worklet-stream', class _ extends AudioWorkletProcessor {
constructor(options) {
super();
}
process(inputs, outputs) {
// do stuff
return true;
}
})
]);
console.log(a, b);
Evidently import is still not supported in AudioWorkletGlobalScope on Firefox (https://bugzilla.mozilla.org/show_bug.cgi?id=1572644), so the import/ServiceWorker workarounds to pre-load data won't work there.
A bit of WebIDL to get things going:
partial interface AudioWorkletGlobalScope {
readonly attribute MessagePort port;
attribute EventHandler onmessage;
};
partial interface AudioWorklet {
readonly attribute MessagePort port;
attribute EventHandler onmessage;
};
This is set up on AudioContext construction and always available, but of course it's only useful if you call addModule to add a script to the AudioWorkletGlobalScope. This doesn't have to be a script that contains a processor; it can be some sort of manager. That said, the port will be "shipped" directly on construction.
What is necessary, though, is to call onmessage on the AudioWorkletGlobalScope at a specific point in the graph rendering algorithm. What would probably be most useful is to dequeue all messages normally, probably intermixed with the events that are targeted at AudioWorkletProcessors.
There's lots of prose to write to set this up properly (create the message channel, serialize/deserialize the port, send them, etc.), but it's not overly complex.
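The proposed pair of ports is essentially a MessageChannel whose ends are pre-shipped to the main thread and the worklet scope; the message flow can be modeled with an explicit MessageChannel (the roles in the comments are illustrative):

```javascript
// port1 plays the role of audioWorklet.port on the main thread;
// port2 plays the role of the port on AudioWorkletGlobalScope.
const { port1, port2 } = new MessageChannel();

// "Worklet" side: a manager script, no processor required.
port2.onmessage = (e) => {
  port2.postMessage(`received ${e.data.byteLength} bytes`);
};

// "Main thread" side: transfer a buffer instead of copying it.
port1.onmessage = (e) => {
  console.log(e.data); // received 128 bytes
  port1.close();
  port2.close();
};

const buf = new ArrayBuffer(128);
port1.postMessage(buf, [buf]);
console.log(buf.byteLength); // 0: the buffer was detached by the transfer
```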
So that change would remove the port
in AudioWorkletProcessor
, which would be essentially useless then?
When implementing WebAssembly integration with Web Audio AudioWorklets using shared memory, one issue is that AudioWorkletGlobalScope does not seem to have postMessage/onmessage functionality.
As a result, in order to share the WebAssembly Module and Memory objects with the AudioWorkletGlobalScope, one needs to create a dummy processor and instantiate it on the main thread just to be able to send these bits of data over.
Unless I have missed something, this looks like a design shortcoming of the spec, and it results in one having to abuse the spec to achieve the necessary goal.
You can see examples of this abuse in two different implementations of Shared Memory Wasm <-> AudioWorklet integrations:
I wonder if I've missed a critical piece of the spec and there is a simpler way to do this - or did this use case fall through in the spec?
If so, would it be possible to add postMessage()/onmessage functionality to AudioWorkletGlobalScope? That would simplify the creation and management of shared-memory Wasm Audio Worklets.