bbc/peaks.js

JavaScript UI component for interacting with audio waveforms
https://waveform.prototyping.bbc.co.uk
GNU Lesser General Public License v3.0

feature request: spectral waveform #528

Open · realies opened this issue 1 month ago

realies commented 1 month ago

Colouring the audio waveform by computing the spectral centroid over a sequence of short-interval frames and mapping those frequencies to colours can enhance content navigation and help show how the timbre and intensity of the sound change over time.
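For illustration, here's a rough sketch of how a per-frame spectral centroid could be computed from raw PCM samples. The frame size, hop size and the naive DFT are illustrative choices of mine, not anything Peaks.js or audiowaveform provides; a real implementation would use an FFT.

```js
// Rough sketch only: per-frame spectral centroid from mono PCM samples.
// `samples` is assumed to be a Float32Array in [-1, 1]; frameSize/hop are
// arbitrary illustrative values. The inner DFT is O(frameSize^2) per frame,
// so a real implementation would use an FFT library instead.
function spectralCentroids(samples, sampleRate, frameSize = 1024, hop = 512) {
  const centroids = [];

  for (let start = 0; start + frameSize <= samples.length; start += hop) {
    let weightedSum = 0;
    let magnitudeSum = 0;

    for (let bin = 1; bin < frameSize / 2; bin++) {
      let re = 0;
      let im = 0;

      for (let n = 0; n < frameSize; n++) {
        const phase = (2 * Math.PI * bin * n) / frameSize;
        re += samples[start + n] * Math.cos(phase);
        im -= samples[start + n] * Math.sin(phase);
      }

      const magnitude = Math.sqrt(re * re + im * im);
      const frequency = (bin * sampleRate) / frameSize;

      weightedSum += frequency * magnitude;
      magnitudeSum += magnitude;
    }

    centroids.push(magnitudeSum > 0 ? weightedSum / magnitudeSum : 0);
  }

  return centroids;
}
```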

I've taken a look at places like:

However, I still have not been able to get a working demo.

I decided to post this feature request here as no web library appears to support a visualisation mode like this.

I think Native Instruments Traktor provides a great example of this kind of waveform display. Its main view shows how the timbre changes from predominantly high-frequency content (mapped to blue hues) to mid frequencies (mapped to green hues) and eventually to bass and sub-bass frequencies (mapped to purple and red hues). I've also attached a few zoomed-in variants of the same playhead position.
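To make the colour mapping concrete, here's a minimal sketch of one way a centroid frequency could be turned into a hue. The log scaling, frequency limits and hue range are guesses of mine, not Traktor's actual mapping.

```js
// Rough sketch only: map a spectral centroid (Hz) to a CSS colour.
// minHz/maxHz and the 0-240 hue range (red -> green -> blue) are illustrative;
// extending the range past 240 would add the purple end.
function centroidToColour(centroidHz, minHz = 50, maxHz = 10000) {
  const clamped = Math.min(Math.max(centroidHz, minHz), maxHz);

  // Log scale so differences in the bass region aren't squashed.
  const t = Math.log(clamped / minHz) / Math.log(maxHz / minHz);
  const hue = t * 240; // 0 = red (bass), 120 = green (mids), 240 = blue (highs)

  return `hsl(${hue}, 90%, 55%)`;
}
```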

chrisn commented 1 month ago

When you say you haven't been able to get a working demo, is that because those projects don't produce the effect you want, or is something else preventing them working?

Adding this would require some substantial changes to Peaks.js. Currently the waveform is just a solid filled (or gradient) colour, so the waveform rendering code would need reworking. Then we'd need a new waveform data format that can store frequency information, and a mapping from that to colours.
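As a purely hypothetical illustration of the kind of extra field such a format might carry (this is not an existing audiowaveform or waveform-data.js format), the existing min/max pairs could be accompanied by one centroid value per waveform point:

```js
// Hypothetical shape only, not a real audiowaveform/waveform-data.js format.
const extendedWaveformData = {
  version: 3,                         // made-up version number
  sample_rate: 44100,
  samples_per_pixel: 256,
  channels: 1,
  length: 3,                          // number of waveform points
  data: [-45, 47, -32, 38, -12, 15],  // existing min/max pairs per point
  centroid: [5200, 1800, 240]         // added: spectral centroid in Hz per point
};
```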

realies commented 1 month ago

> When you say you haven't been able to get a working demo, is that because those projects don't produce the effect you want, or is something else preventing them working?

Mainly because of basic implementation issues such as drawing on the canvas and computing the correct centroid values per audio segment. There will probably also be a need for segment smoothing and for varying the number of segments rendered based on the zoom level. Not giving up just yet.
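As one possible shape for the drawing part, here's a sketch that colours each waveform column by its centroid using the centroidToColour() mapping sketched above. The { min, max, centroidHz } point structure is my own assumption, not anything Peaks.js defines.

```js
// Rough sketch only: draw a min/max waveform, one coloured column per point.
// Assumes points = [{ min, max, centroidHz }, ...] with min/max in [-1, 1],
// one entry per pixel column; not Peaks.js API.
function drawColouredWaveform(canvas, points) {
  const ctx = canvas.getContext('2d');
  const midY = canvas.height / 2;
  const scale = canvas.height / 2;

  ctx.clearRect(0, 0, canvas.width, canvas.height);

  points.forEach((point, x) => {
    ctx.strokeStyle = centroidToColour(point.centroidHz);
    ctx.beginPath();
    ctx.moveTo(x + 0.5, midY - point.max * scale);
    ctx.lineTo(x + 0.5, midY - point.min * scale);
    ctx.stroke();
  });
}
```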

> Adding this would require some substantial changes to Peaks.js. Currently the waveform is just a solid filled (or gradient) colour, so the waveform rendering code would need reworking. Then we'd need a new waveform data format that can store frequency information, and a mapping from that to colours.

Does that mean Peaks.js is probably not the best place to implement this, given the major restructure? I assume that if a front-end algorithm can compute and draw the frequency data without a huge compute impact, it could be ported to audiowaveform along with the new data structure. Mapping frequency ranges to colour hues sounds fairly trivial in the overall context.

chrisn commented 1 month ago

I think to start with, I'd want audiowaveform to produce the data and Peaks.js to just do the rendering. If we need the computation to also be done in the web front end, waveform-data.js would be the place to implement that. Changing the rendering in Peaks.js is doable, but it means rewriting that part of the code, so it's not really a major restructure.
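If frequency values were carried alongside the min/max data, resampling for different zoom levels could average the centroids of the columns being merged, weighted by amplitude so quiet frames don't skew the colour. Again just a sketch under the same assumed point structure, not part of waveform-data.js:

```js
// Rough sketch only: collapse `factor` adjacent points into one for a
// zoomed-out view, averaging centroids weighted by column amplitude.
function downsampleCentroids(points, factor) {
  const out = [];

  for (let i = 0; i < points.length; i += factor) {
    const group = points.slice(i, i + factor);
    let weightedSum = 0;
    let weightSum = 0;
    let min = Infinity;
    let max = -Infinity;

    for (const p of group) {
      const amplitude = p.max - p.min;
      weightedSum += p.centroidHz * amplitude;
      weightSum += amplitude;
      min = Math.min(min, p.min);
      max = Math.max(max, p.max);
    }

    out.push({ min, max, centroidHz: weightSum > 0 ? weightedSum / weightSum : 0 });
  }

  return out;
}
```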