codethisio opened this issue 2 months ago (Open)
Thanks! I guess the example has to be adjusted to init Web Audio on click.
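A rough sketch of what that adjustment could look like (an assumption about the intended fix, not tested): create the AudioContext and the filter chain inside a one-time click handler, reusing the audio element and the eqBands array from the example below.

// Hypothetical adjustment, not the shipped example: defer all Web Audio setup
// until the first user gesture, so Chrome doesn't create the context in the
// "suspended" state before the page is interactive.
document.addEventListener(
  'click',
  () => {
    const audioContext = new AudioContext()

    // Same equalizer bands as in the example below
    const filters = eqBands.map((band) => {
      const filter = audioContext.createBiquadFilter()
      filter.type = band <= 32 ? 'lowshelf' : band >= 16000 ? 'highshelf' : 'peaking'
      filter.gain.value = 0
      filter.Q.value = 1
      filter.frequency.value = band
      return filter
    })

    // Route the media element through the filter chain to the speakers
    const mediaNode = audioContext.createMediaElementSource(audio)
    const equalizer = filters.reduce((prev, curr) => {
      prev.connect(curr)
      return curr
    }, mediaNode)
    equalizer.connect(audioContext.destination)
  },
  { once: true },
)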
Doesn't seem to work in version 1.68.131 (Chromium: 127.0.6533.73, Official Build, 64-bit). I also tried hosting it myself. The example at https://wavesurfer.xyz/examples/?webaudio.js should be flagged as having issues so that users' time is not wasted.
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>wavesurfer.js examples</title>
<style>
html, body {
margin: 0;
padding: 0.5rem;
font-family: sans-serif;
background-color: transparent;
font-size: 16px;
}
input {
vertical-align: text-top;
}
@media (prefers-color-scheme: dark) {
body {
background: #333;
color: #eee;
}
a {
color: #fff;
}
}
</style>
</head><body><script type="module" data-type="module">
// Web Audio example
import WaveSurfer from 'https://cdn.jsdelivr.net/npm/wavesurfer.js@7.8/dist/wavesurfer.esm.js'
// Create your own media element
const audio = new Audio()
audio.controls = true
audio.src = 'https://wavesurfer.xyz/wavesurfer-code/examples/audio/audio.wav'
// Create a WaveSurfer instance and pass the media element
const wavesurfer = WaveSurfer.create({
container: document.body,
waveColor: 'rgb(200, 0, 200)',
progressColor: 'rgb(100, 0, 100)',
media: audio, // <- this is the important part
})
// Optionally, add the audio to the page to see the controls
document.body.appendChild(audio)
// Now, create a Web Audio equalizer
// Create Web Audio context
const audioContext = new AudioContext()
// Define the equalizer bands
const eqBands = [32, 64, 125, 250, 500, 1000, 2000, 4000, 8000, 16000]
// Create a biquad filter for each band
const filters = eqBands.map((band) => {
const filter = audioContext.createBiquadFilter()
filter.type = band <= 32 ? 'lowshelf' : band >= 16000 ? 'highshelf' : 'peaking'
filter.gain.value = Math.random() * 40 - 20
filter.Q.value = 1 // resonance
filter.frequency.value = band // the cut-off frequency
return filter
})
// Connect the audio to the equalizer
audio.addEventListener(
'canplay',
() => {
// Create a MediaElementSourceNode from the audio element
const mediaNode = audioContext.createMediaElementSource(audio)
// Connect the filters and media node sequentially
const equalizer = filters.reduce((prev, curr) => {
prev.connect(curr)
return curr
}, mediaNode)
// Connect the filters to the audio output
equalizer.connect(audioContext.destination)
},
{ once: true },
)
// Create a vertical slider for each band
const container = document.createElement('p')
filters.forEach((filter) => {
const slider = document.createElement('input')
slider.type = 'range'
slider.orient = 'vertical'
slider.style.appearance = 'slider-vertical'
slider.style.width = '8%'
slider.min = -40
slider.max = 40
slider.value = filter.gain.value
slider.step = 0.1
slider.oninput = (e) => (filter.gain.value = e.target.value)
container.appendChild(slider)
})
document.body.appendChild(container)
</script></body></html>
I am attempting to create gain controls for each audio channel in my Wavesurfer implementation. I noticed an issue where Chrome does not seem to allow manipulating an audio element before the user has interacted with the UI.
When testing the Web Audio example (https://wavesurfer.xyz/examples/?webaudio.js) in Chrome 127.0.6533.100, the Play/Pause button on the audio element is non-functional. This is consistent with my own app as well.
Everything works correctly in Firefox.
I need to preload the Wavesurfer instance to display the waveform, but at that point the UI is technically not yet interactive. Since the Wavesurfer binding occurs when the view loads, the audio element bombs out, presumably due to Chrome's autoplay policy.
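A possible workaround, sketched here under the assumption that audioContext is the context created on view load as in the example above (untested, and not an official fix): keep the existing setup, but resume the suspended context on the first user gesture.

// Assumes audioContext was created on page load, as in the example above.
// Chrome leaves such a context in the "suspended" state until a user gesture occurs.
const resumeOnGesture = () => {
  if (audioContext.state === 'suspended') {
    audioContext.resume()
  }
}

// A pointer or keyboard event counts as a user gesture for the autoplay policy
document.addEventListener('pointerdown', resumeOnGesture, { once: true })
document.addEventListener('keydown', resumeOnGesture, { once: true })

The idea is that the waveform can still be set up on load, while the audible Web Audio graph only starts once the user interacts with the page.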