WebAudio / web-midi-api

The Web MIDI API, developed by the W3C Audio WG
http://webaudio.github.io/web-midi-api/

MIDI API should be available from Workers? #99

Closed: cwilso closed this issue 1 day ago

cwilso commented 10 years ago

It has been suggested to me that the MIDI API should be available from Workers. For the purposes of keeping a sequence going in a reasonable manner, this sounds like a great idea to me.

toyoshim commented 10 years ago

+1

nfroidure commented 10 years ago

+1

notator commented 10 years ago

+1

marcoscaceres commented 10 years ago

For record keeping, can you just list some of the use cases that you envision for having the API running in workers?

nfroidure commented 10 years ago

@marcoscaceres at least this one https://github.com/cwilso/WebMIDIAPIShim/issues/33

igorclark commented 10 years ago

Hi @marcoscaceres, this would be really helpful for my use case. I'm generating MIDI events in Javascript, and GUI/window resize events can really affect the timing.

I've moved the scheduling code into a web worker, which works well: events inside the worker fire within 1.5-1.7ms of their intended time on average, tested over periods of 5 minutes and more.

However after that, the scheduler thread has to send the outgoing MIDI messages back via the main thread, at which point they can still be interrupted. If the web worker could send the MIDI messages directly, this wouldn't happen, and the timing could be much more reliable.
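The workaround described above can be sketched roughly like this (a minimal illustration of the pattern, not production code; all names are made up for this sketch):

```javascript
// Sketch of the current workaround: the worker schedules, but the
// main thread still owns the MIDIOutput. All names are illustrative.

// Pure helper: build a note-on message (status byte 0x90 | channel).
function noteOn(channel, note, velocity) {
  return [0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}

// Worker side: emit every event due within `lookaheadMs` of `now()`,
// relaying each one to the main thread -- the hop where timing can
// still be disturbed, which is the problem this issue is about.
function pumpScheduler(events, lookaheadMs, now, relay) {
  const t = now();
  while (events.length && events[0].time <= t + lookaheadMs) {
    const ev = events.shift();
    relay({ data: ev.data, time: ev.time }); // e.g. postMessage(ev)
  }
  return events; // events not yet due
}
```

On the main thread, the relayed events would be passed to `MIDIOutput.send(data, time)`; if the worker could call `send` itself, the relay hop (and its jitter) would disappear.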

notator commented 10 years ago

@marcoscaceres @igorclark Hi, my AssistantPerformer (https://github.com/notator/assistant-performer) also generates MIDI events in Javascript, and wants to do that over arbitrarily long periods of time without being disturbed by interruptions in the main thread. So +1 to igorclark -- especially his last paragraph. I'm also planning a polyphonic version of this application. This would be like playing a prepared piano. Obviously, the more integration with workers, the better.

marcoscaceres commented 10 years ago

Thanks everyone! These are really helpful.

jussi-kalliokoski commented 10 years ago

Sounds great to me!

jaycliff commented 10 years ago

This would be really nice :)

joeberkovitz commented 9 years ago

+1 on the feature, for sure. But perhaps this isn't critical for release, given that events in a sequence can be assigned exact timestamps and that the main thread need not schedule events in tight proximity to their physical output.

MidiHax commented 9 years ago

+1 I think @igorclark nailed the use case that all of us developing sequencers have encountered. This seems like a rather essential feature in my opinion.

toyoshim commented 9 years ago

I also think this is an important feature. Once basic part of the spec is fixed, I'll work on this.

cwilso commented 9 years ago

It's sounding like people want this in v1. (It's pretty straightforward to do; in addition to

partial interface Navigator {
    Promise<MIDIAccess> requestMIDIAccess (optional MIDIOptions options);
};

we need

partial interface WorkerNavigator {
    Promise<MIDIAccess> requestMIDIAccess (optional MIDIOptions options);
};

and (I think) that's pretty much it.) Additional cost on implementations, of course.
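A hedged sketch of what the WorkerNavigator addition above would let worker code do (the API shape follows the IDL in this comment; everything else is illustrative):

```javascript
// Inside a dedicated worker, `self.navigator` would be a
// WorkerNavigator. Feature-detect first, since no shipped browser
// exposes requestMIDIAccess there yet.
function supportsMIDI(nav) {
  return typeof nav.requestMIDIAccess === 'function';
}

// Request access and grab the first output port, if any.
async function openFirstOutput(nav) {
  const access = await nav.requestMIDIAccess({ sysex: false });
  for (const output of access.outputs.values()) return output;
  return null;
}
```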

ryanlaws commented 7 years ago

This is critical for acceptable timing. How can I help move this forward?

toyoshim commented 7 years ago

I understand the importance of using Web MIDI in workers. But as one of the Chromium committers, I would rather wait for another browser implementation than make the spec harder to implement, so that we don't fail to standardize the Web MIDI API.

In terms of the spec, it should be easy, as Chris proposed. Could we make worker support an optional feature for v1, so that any v1-ready browser can start supporting it when ready?

JohnWeisz commented 5 years ago

Both the Web MIDI API and the Web Audio API have to be available in DedicatedWorkerGlobalScope for production performance and stability, especially real-time performance (because scheduling is already possible, albeit rather inconvenient).

Until then, it's just a toy/tech demo, unfortunately (well, compared to the stability that would be possible with worker support, so take this as a deliberate exaggeration).

hoch commented 5 years ago

I think AudioWorkletGlobalScope might be the better fit for this purpose. Then you can process MIDI data right inside of the audio rendering thread. Although I think deeper coordination would be better (like a non-blocking read from a MIDI message queue in the audio callback), just exposing the callback inside the worklet global scope will help a lot of use cases.

There are already several web-based DAWs using Audio Worklet as an audio engine driver, and I believe this will be a common pattern.

hoch commented 5 years ago

@JohnWeisz Sadly, several reports from developers suggest Workers are not suitable for audio processing. When thread contention happens, they are the first to suffer, so a system with a low core count or a capped thread pool (e.g. mobile) will perform significantly worse. That's why we want something like Audio Device Client, to be able to run audio-specific tasks on a high-priority thread.

JohnWeisz commented 5 years ago

@hoch I understand your point, but the main pain point here is the potentially real-time-critical MIDI message processing being done on a thread shared with blocking UI and layout calls.

"I think AudioWorkletGlobalScope might be the better fit for this purpose."

In many cases it's actually a lot better indeed. If subsequent heavy-duty processing of the MIDI data is needed, it can be handled on a worker thread without main-thread interference anyway (e.g. by transferring a worker's MessagePort to the audio worklet directly).

Furthermore, perhaps an even more robust solution would be for each midi device to simply have a transferable MessagePort in the first place, instead of onmidimessage. User code can then do whatever it wants, wherever it wants, simply by transferring this MessagePort (so this would automatically support audio worklet and dedicated worker).

hoch commented 5 years ago

Agreed. I think the ideal setup would actually be:

MIDI-enabled Worker > SharedArrayBuffer > Audio Thread (Audio Worklet or Audio Device Client)

This way no postMessage would be needed at all after the initial setup.

JohnWeisz commented 5 years ago

MIDI-enabled Worker > SharedArrayBuffer > Audio Thread (Audio Worklet or Audio Device Client)

Just for the record, what I meant with a transferable MessagePort (second part of my comment) was something like this (for a theoretical MIDI input):

  1. Request MIDIAccess on main thread (or even worker/worklet, doesn't matter)
  2. Get MIDIInput input
  3. Transfer port property of input to an audio-worklet thread, or a dedicated worker thread, whichever one the user code needs
  4. Listen for MIDI message events using the transferred port's message handler, which dispatches message events (with the same data payload) where MIDIInput previously dispatched midimessage events

The benefit of this system is that MIDI messages can be listened to on any thread, by simply transferring MIDIInput.port. I think this implies that MIDIInputs/MIDIOutputs get their own thread (each, or 1 total). This approach also doesn't require a SharedArrayBuffer.
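The transfer step in that list could look something like this (`MIDIInput.port` as a transferable MessagePort is hypothetical; no such property exists in the spec today, and all names below are illustrative):

```javascript
// Main thread: hand the hypothetical `input.port` to a worker or
// audio worklet. The second argument transfers ownership of the port.
function transferInputPort(input, workerLike) {
  workerLike.postMessage({ midiPort: input.port }, [input.port]);
}

// Receiving side (worker or worklet): listen for MIDI bytes directly,
// with no main-thread hop in between.
function listenOnPort(port, onBytes) {
  port.onmessage = (e) => onBytes(e.data); // e.data: MIDI byte array
}
```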

hoch commented 5 years ago

I understood what you meant. That seems feasible, but in order to do that, MessagePort MUST be capable of handling the input object. Simply exposing MIDIAccess to the other Worker or Worklet global scope would be simpler. (The strong tie to the navigator object might be a problem, perhaps?)

Using SharedArrayBuffer can be a robust alternative, or even a basis for rapid prototyping. An idea with a working prototype is always better for making spec progress faster. Just a thought.

JohnWeisz commented 5 years ago

Ultimately, whichever is simpler and gets this thing off the main thread at last.

padenot commented 5 years ago

Agreed. I think the ideal setup would actually be:

MIDI-enabled Worker > SharedArrayBuffer > Audio Thread (Audio Worklet or Audio Device Client)

This way no postMessage would be needed at all after the initial setup.

Yes, let's not pollute the various APIs by special-casing 1:1 relationships when we can compose things better instead and achieve the same things with the same performance.

7ombie commented 2 years ago

What happened to this? The spec talks about Launchpads and DJ controllers, but you can't build anything serious around MIDI controllers with main-thread latencies. In a pro-audio context, that would be super nasty.

7ombie commented 2 years ago

I think AudioWorkletGlobalScope might be the better fit for this purpose. Then you can process MIDI data right inside of the audio rendering thread.

I don't think the audio thread should be handling MIDI (especially outbound MIDI messages for controlling LEDs etc). It's not what worklets were designed for. Besides, a regular worker can use a shared array buffer (or Wasm memory) to write directly to the memory that the audio thread is rendering from, without extra latency or blocking the audio thread.

7ombie commented 2 years ago

Just nudging this thread, hoping to get a status update. A few noteworthy points...

There's an interesting proposal for exposing regular input (basically keyboard and pointer events) in workers. That proposal contains some use cases and general observations that are relevant to controlling realtime audio.

Chromium et al automatically use high priority threads for audio worklets now, and that has substantially expanded the range of serious audio applications that are possible on the platform. The only major limitation that remains (on Chromium, at least) is the inability to handle low-latency input (WebUSB is threadable, but does not expose (class-compliant) HID or MIDI devices).

In short, if we could (directly) handle keyboard, touch, and MIDI events in workers, we would have all of the essential primitives for serious realtime audio programming in the browser. Given that this thread is just about updating the spec, I cannot see any reason to hesitate any longer.

cwilso commented 2 years ago

@hoch and @padenot should weigh in on the feasibility, but I'd hesitate to just put it in the spec without some confidence that it's possible and likely to implement.

hoch commented 2 years ago

In terms of feasibility, I believe @toyoshim can provide us with a better answer.

I am also aware that Firefox has a working implementation, so if both implementors are positive about the feasibility, the spec change doesn't seem controversial.

cwilso commented 2 years ago

@padenot is Firefox's implementation exposed on Worker?

padenot commented 2 years ago

We implement the spec as it is today, so no.

I don't think it would be particularly hard to do so, though, modulo the permission aspect, that is tied to navigator at the minute.

I do think it would be a good idea, and in fact, necessary, for any software that does anything non-trivial on the main thread.

toyoshim commented 2 years ago

I experimentally created a POC patch to support Web MIDI in dedicated workers. https://chromium-review.googlesource.com/c/chromium/src/+/3439238

We have some code that depends on information bound to DOMWindow, but we can fix that easily. So what we really need is just to expose the requestMIDIAccess interface on WorkerNavigator. It should not be difficult to brush up this CL to be production-ready.

7ombie commented 2 years ago

Can we move this forwards, please guys, and get the specification updated? I believe the reasons for holding off on this have been addressed now. It's just the API that needs finalizing (re. navigator). Thanks.

Boscop commented 1 year ago

I'd really appreciate it if the Web MIDI API were accessible from Web Workers. When implementing a MIDI player that needs to be stepped forward at 60 FPS, the only way to do it right now seems to be either setInterval (bad performance and jitter) or requestAnimationFrame (which doesn't run while the window is unfocused, so MIDI playback pauses when the user switches from the browser window to the DAW window). Or did I miss another way? I'm curious if there is one. If I run the MIDI player in a worker but use message passing to send the MIDI events to the main thread before sending them out to the MIDI port, do you think that would work without added latency/jitter, or would it be just as suboptimal as setInterval/requestAnimationFrame?

Anyhow, if the MIDI API was accessible from web workers, a loop could be run in a worker thread for the MIDI playback.
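For reference, the worker-as-clock workaround hinted at above can be sketched like this (worker timers are generally not paused the way requestAnimationFrame is in an unfocused window, though browsers may still coarsen them; all names below are illustrative):

```javascript
// Pure helper: timer period for a given step rate.
function msPerTick(fps) {
  return 1000 / fps;
}

// Main thread: spin up an inline worker whose only job is to tick.
// The MIDI events are still sent from the main thread, so this only
// avoids the unfocused-window pause; it does not remove the jitter.
function startTickWorker(fps, onTick) {
  const src = `setInterval(() => postMessage(0), ${msPerTick(fps)});`;
  const worker = new Worker(URL.createObjectURL(new Blob([src])));
  worker.onmessage = onTick; // step the MIDI player here
  return worker;
}
```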

JSmithOner commented 1 year ago

Any updates on this? Is it that hard to implement? I'm using Web MIDI extensively in my app, and the DOM is drastically slowing down some processes. Thanks in advance.

hoch commented 1 year ago

I believe everyone in the WG understood and agreed with the need for this change, but the group needs to come up with the spec text first. Sorry for the delay, but I am cautiously looking at 2023.

7ombie commented 1 year ago

Thanks, @hoch. I appreciate the update. It's nice to know things are moving along, even if it'll take a while.

If this spec is committed to threaded MIDI, then the Input for Workers Proposal should drop their (stalled) effort to do the same thing. IMO, the scope of that proposal should have always been limited to keyboard and pointer events (which are tied to the DOM), and never addressed MIDI events (which have always belonged in the MIDI spec).

I'm not sure if the Input for Workers proposal is actively maintained, but if so, somebody should let them know to remove MIDI events.

I forgot that I tried to contact the Input for Workers people six months ago, just asking for a sign of life, and got nowhere. I've updated them anyway.

7ombie commented 1 year ago

As mentioned briefly before, while I'd like to see MIDI in workers, I don't personally think MIDI should be exposed to audio worklets.

Worklets are very specialized. I don't think it's appropriate to handle MIDI on the browser's audio thread.

Note: Apps that are sensitive to latency can implement their MIDI interface in a regular worker, then use a shared array buffer to write directly to the memory that the audio worklet is rendering from.

I'm not certain, but I don't think any of the other input APIs (WebHID, WebUSB, WebBluetooth etc) are planning to make their APIs available in worklets (of any kind).

JohnWeisz commented 1 year ago

Note: Apps that are sensitive to latency can implement their MIDI interface in a regular worker, then use a shared array buffer to write directly to the memory that the audio worklet is rendering from.

Or a MessageChannel to communicate directly between the MIDI worker thread and the audio worklet/rendering thread, avoiding the main thread.

padenot commented 1 year ago

There are no plans to expose MIDI to Audio Worklets. I also don't know of other APIs being exposed in Worklets.

Communicating from a regular Web Worker to an AudioWorkletGlobalScope using SharedArrayBuffer, via an SPSC wait-free queue, is the way to go.

postMessage(...) will cause problems; this blog post explains why, with benchmarks.
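The SPSC pattern described above can be sketched as a byte ring buffer over a SharedArrayBuffer (a minimal illustration under those assumptions, not a tuned wait-free queue; all names are made up):

```javascript
// Single-producer, single-consumer byte queue. The MIDI worker calls
// push(); the AudioWorkletGlobalScope calls pop() inside process().
// One slot is kept empty to distinguish "full" from "empty".
class SpscByteQueue {
  constructor(sab, capacity) {
    this.idx = new Int32Array(sab, 0, 2);       // [readIdx, writeIdx]
    this.buf = new Uint8Array(sab, 8, capacity); // data after indices
  }
  push(bytes) { // producer thread only
    const n = this.buf.length;
    const read = Atomics.load(this.idx, 0);
    const write = this.idx[1];
    const free = (read - write - 1 + n) % n;
    if (bytes.length > free) return false;      // queue full: caller decides
    for (let i = 0; i < bytes.length; i++) {
      this.buf[(write + i) % n] = bytes[i];
    }
    Atomics.store(this.idx, 1, (write + bytes.length) % n);
    return true;
  }
  pop(out) { // consumer thread only; returns number of bytes read
    const n = this.buf.length;
    const read = this.idx[0];
    const write = Atomics.load(this.idx, 1);
    const avail = (write - read + n) % n;
    const count = Math.min(avail, out.length);
    for (let i = 0; i < count; i++) {
      out[i] = this.buf[(read + i) % n];
    }
    Atomics.store(this.idx, 0, (read + count) % n);
    return count;
  }
}
```

Because only atomic index updates cross threads, neither side ever blocks the other: the audio callback never waits on a lock or a postMessage round-trip.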

mjwilson-google commented 9 months ago

Audio Working Group 2023-10-05 meeting conclusions:

mjwilson-google commented 8 months ago

I wrote "WorkletGlobalScope" but it looks like we will expose this in WorkerGlobalScope; please see discussion in #256 if interested.