Open p-himik opened 1 year ago
I don't think I can create an audio buffer without an audioContext, but maybe I can create a buffer with a suspended one. I haven't tried.
At the very least, there should be a way to load all the soundfont files before a user interaction.
But yeah - the browsers want a user interaction.
Pretty sure you can create an audio buffer with a suspended audio context - I've checked it in Chrome, Firefox, and Safari.
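To illustrate that claim, here is a minimal browser-only sketch (not abcjs-specific, and not runnable outside a browser): an `AudioContext` created before any user gesture starts out `"suspended"`, but an `OfflineAudioContext` can still render an `AudioBuffer` without any interaction, which is enough for drawing a waveform:

```javascript
// Browser-only sketch. Autoplay policies gate playback, not offline rendering.
const ctx = new AudioContext();
console.log(ctx.state); // typically "suspended" before a user gesture

// Render one second of a 440 Hz tone offline (1 channel, 44100 Hz).
const offline = new OfflineAudioContext(1, 44100, 44100);
const osc = offline.createOscillator();
osc.frequency.value = 440;
osc.connect(offline.destination);
osc.start();
offline.startRendering().then((buffer) => {
  // buffer is an AudioBuffer; buffer.getChannelData(0) can feed a waveform view.
  console.log(buffer.length);
});
```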
My use-case involves creating and showing a waveform of the sound that ABCJS produces from a particular ABC sequence.
The promise returned by `CreateSynth.prime` is resolved only when the audio context is resumed. Modern browsers (at least Chrome, but AFAIK others do it as well) prevent audio contexts from starting prior to any user interaction, which makes it impossible to reliably use audio buffers attached to an instance of `CreateSynth` before a user has a chance to interact with the web page.

I don't really know why `CreateSynth` cares about the state of the ABCJS-global audio context at all, but if that's important, it's possible to resolve this issue in a non-breaking manner by splitting the `prime` method in two: one method that returns a promise resolved when all the notes are placed, and another, still named `prime`, that uses the former and only adds the audio context handling on top of that promise. As an alternative, perhaps turning ABC into an audio buffer could be extracted into its own exported function somewhere.