
Edenator

Edenator is a browser-based music visualizer built with React, three.js, and Zustand. A live demo is available at https://radioprotector.github.io/edenator/, and the project is released under the MIT License.

This project is written in TypeScript and makes use of the Hooks API. All primary components use the functional component style. It was bootstrapped with Create React App.

The Zustand-based store is primarily used to keep track of the current song being played and the theme in use for the visualization.
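
As a rough sketch of what such a store might look like (the field and action names here are assumptions and may differ from the repository's actual store):

```ts
// A minimal sketch of a Zustand store tracking the current song and theme.
// Field and action names are illustrative, not the repository's exact API.
import { create } from 'zustand';

interface Theme {
  name: string;
  // ...color fields elided; see the Theming section below
}

interface VisualizerState {
  currentSong: File | null;
  theme: Theme | null;
  setCurrentSong: (song: File | null) => void;
  setTheme: (theme: Theme | null) => void;
}

export const useStore = create<VisualizerState>((set) => ({
  currentSong: null,
  theme: null,
  setCurrentSong: (currentSong) => set({ currentSong }),
  setTheme: (theme) => set({ theme }),
}));
```

Components then subscribe with selectors (e.g. `const theme = useStore((state) => state.theme);`) so that only the pieces of state they actually read trigger re-renders.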

Visualizing Songs

The overall App component is responsible for application scaffolding, file and theme selection, and audio playback.

The Visualizer component is responsible for configuring the Canvas that contains all other visualization elements, each of which is implemented as its own child component.
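
As a structural sketch, assuming the project bridges React and three.js through @react-three/fiber (whose Canvas component hosts a three.js scene), the Visualizer might resemble the following; the child component names are placeholders rather than the repository's actual components:

```tsx
// A sketch of the Visualizer's structure. The @react-three/fiber Canvas is an
// assumed choice for hosting three.js in React; child names are placeholders.
import React from 'react';
import { Canvas } from '@react-three/fiber';

function Visualizer(): JSX.Element {
  return (
    <Canvas camera={{ position: [0, 0, 10], fov: 60 }}>
      {/* Each visualization element would be its own child component, e.g.: */}
      {/* <BeatQueue />, <FrequencyGrid />, <BackgroundEffects /> */}
    </Canvas>
  );
}

export default Visualizer;
```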

Common Techniques

The following processes are common in these visualizations:

Theming

The Theme interface describes relevant information for theming a track visualization. There are numerous color-based themes, most of which have been generated using a handful of "seed" colors provided to the generateThemeForColor helper method.
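
As an illustration of the seed-color idea (a sketch only; the real generateThemeForColor helper may use entirely different rules), a generator might derive related colors by shifting the seed's hue and lightness:

```ts
// A hypothetical theme generator that derives related colors from one seed.
// The GeneratedTheme shape and the HSL offsets are illustrative assumptions.
import { Color } from 'three';

interface GeneratedTheme {
  name: string;
  primary: Color;
  secondary: Color;
  background: Color;
}

function generateThemeSketch(name: string, seed: Color): GeneratedTheme {
  const hsl = { h: 0, s: 0, l: 0 };
  seed.getHSL(hsl);

  return {
    name,
    primary: seed.clone(),
    // Rotate the hue for a contrasting accent color.
    secondary: new Color().setHSL((hsl.h + 0.5) % 1, hsl.s, hsl.l),
    // Darken and desaturate for a background color.
    background: new Color().setHSL(hsl.h, hsl.s * 0.5, Math.max(hsl.l - 0.4, 0.05)),
  };
}
```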

When a theme is selected, it is tracked in the Zustand store and components will pull the relevant information from that centralized store.

For HTML elements that are influenced by the theme, the relevant colors are applied by the AppStyles component, as sketched below.

The AppStyles component is also responsible for dynamically updating the page's <meta name="theme-color"> tag and application icon based on the theme.
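
A minimal sketch of these responsibilities follows. Exposing the theme to plain HTML elements through CSS custom properties is an assumption for illustration; updating the <meta name="theme-color"> tag reflects the behavior described above:

```tsx
// A sketch of an AppStyles-like component. The CSS custom property name and
// the component's props are assumptions, not the repository's actual API.
import { useEffect } from 'react';

function AppStyles({ primaryColor }: { primaryColor: string }): null {
  useEffect(() => {
    // Make the theme color available to HTML elements via a CSS variable.
    document.documentElement.style.setProperty('--theme-primary', primaryColor);

    // Keep the browser chrome in sync with the current theme.
    const meta = document.querySelector<HTMLMetaElement>('meta[name="theme-color"]');
    if (meta !== null) {
      meta.content = primaryColor;
    }
  }, [primaryColor]);

  return null;
}

export default AppStyles;
```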

The ThemeReviewer component is only included in development builds. It adds a "Review themes" button that enumerates the themes and the colors they use in a tabular format, making it easier to compare, contrast, and preview the resulting colors.

Song Analysis

While the Web Audio API's AnalyserNode can be used to drive many real-time visualizations, it does not allow for "looking ahead" into the progression of the song. Having a beat approach from afar (before it has been played) and reach the camera only at the moment it plays therefore requires analyzing the track ahead of time.

When a song is selected, it goes through an analysis process covering keying, peak and BPM detection, and lull detection, each described below.

This process ultimately produces a TrackAnalysis object which is used throughout the visualization system.
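
A minimal sketch of the up-front decoding that makes this possible, using only standard Web Audio calls (the TrackAnalysis shape shown here is abbreviated and assumed):

```ts
// Decode the *entire* file into raw PCM samples so that peaks can be located
// at any point in the song, not just at the current playback position.
interface TrackAnalysisSketch {
  durationSeconds: number;
  // ...peak collections, BPM, lulls, and trackHash elided
}

async function analyzeTrack(file: File): Promise<TrackAnalysisSketch> {
  const audioContext = new AudioContext();
  const buffer = await audioContext.decodeAudioData(await file.arrayBuffer());

  // The peak, BPM, and lull detection passes described below would scan
  // these samples from start to finish.
  const samples = buffer.getChannelData(0);
  void samples;

  return { durationSeconds: buffer.duration };
}
```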

Keying

When a key is provided in the song's tags, it is normalized to Open Key Notation whenever possible. These values are mapped to specific color themes when available.

No automated key detection is currently supported. When no key has been provided, the song is assigned a random color theme.
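
As an illustration of the normalization step, a lookup table can map tagged keys onto Open Key Notation (in which 1d is C major and 1m is A minor). Only a few entries are shown, and the repository's actual mapping logic may differ:

```ts
// A partial map from conventional key names to Open Key Notation.
// A full map would cover all 24 major and minor keys.
const OPEN_KEY_MAP: Record<string, string> = {
  'C': '1d',  'Am': '1m',
  'G': '2d',  'Em': '2m',
  'D': '3d',  'Bm': '3m',
  // ...remaining keys continue around the circle of fifths
};

function normalizeKey(taggedKey: string): string | null {
  return OPEN_KEY_MAP[taggedKey.trim()] ?? null;
}
```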

Peak and BPM Detection

While peak detection uses absolute and relative thresholds, the analyzer will attempt to determine a cutoff point if more peaks are identified than expected. The expected count is calculated by multiplying a maximum number of peaks per minute by the song's duration; the quietest peaks are then dynamically trimmed until that target is reached (see the sketch after the table). Peaks are detected in the following ranges:

Range      Min Hz   Max Hz   Peaks/Min
Sub-bass   20       50       60
Bass       50       90       120
Beat       90       200      300
Treble     2048     n/a      120
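
The trimming step referenced above can be sketched as follows; the Peak shape is an assumption for illustration:

```ts
// Trim detected peaks down to the expected count by dropping the quietest.
interface Peak {
  time: number;      // seconds into the track
  intensity: number; // measured amplitude of the peak
}

function trimPeaks(peaks: Peak[], maxPerMinute: number, durationSeconds: number): Peak[] {
  // The target count is peaks-per-minute multiplied by the song duration.
  const target = Math.floor(maxPerMinute * (durationSeconds / 60));

  if (peaks.length <= target) {
    return peaks;
  }

  // Keep only the `target` loudest peaks, then restore chronological order.
  return [...peaks]
    .sort((a, b) => b.intensity - a.intensity)
    .slice(0, target)
    .sort((a, b) => a.time - b.time);
}
```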

In the event a BPM value was not found in the song's tags, the system will attempt to determine one from the Beat peaks detected above. Automatic detection tends to produce results in the 90-180 BPM range and, to date, appears to run slightly higher than the song's "normal" BPM. This may be because the Beat range's parameters are oriented more towards a visually appealing display than towards pure BPM detection.
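
One common way to estimate BPM from the detected Beat peaks is to take the median interval between consecutive peaks and fold the result into the 90-180 range; this is a sketch of that idea, not necessarily the analyzer's exact approach:

```ts
// Estimate BPM from beat peak times (in seconds) via the median interval.
function estimateBpm(beatPeakTimes: number[]): number | null {
  if (beatPeakTimes.length < 2) {
    return null;
  }

  const intervals: number[] = [];
  for (let i = 1; i < beatPeakTimes.length; i++) {
    intervals.push(beatPeakTimes[i] - beatPeakTimes[i - 1]);
  }

  intervals.sort((a, b) => a - b);
  const median = intervals[Math.floor(intervals.length / 2)];
  if (median <= 0) {
    return null;
  }

  // Fold the raw estimate into the 90-180 BPM range by halving or doubling.
  let bpm = 60 / median;
  while (bpm >= 180) { bpm /= 2; }
  while (bpm < 90) { bpm *= 2; }
  return bpm;
}
```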

Lull Detection

Lull detection is performed after the main peak analyses have completed. Lulls are primarily intended to add further visual detail during stretches when fewer peaks are represented on-screen.

The detection process sets a maximum number of lulls per minute and a minimum duration (tied to measure length) for each lull. The scanning process keeps track of the longest lulls identified; once the limit has been reached, shorter lulls are discarded in favor of longer ones.

Lulls are first scanned against the beat and treble peaks. If, after scanning, the limit has not been hit, lulls are then scanned against the sub-bass peaks and the already-detected lulls.

As a last resort, the longest lulls will be continually subdivided until the limit has been hit or the longest lull in the collection is less than twice that of the shortest lull in the collection.
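
The core scanning pass might look like the following sketch, which finds qualifying gaps between consecutive peaks and keeps only the longest ones; the Lull shape is an assumption, and the subdivision pass is omitted for brevity:

```ts
// Find gaps between consecutive peak times that are long enough to count as
// lulls, then prefer the longest lulls once the limit has been reached.
interface Lull {
  start: number; // seconds
  end: number;   // seconds
}

function scanForLulls(peakTimes: number[], minDuration: number, limit: number): Lull[] {
  const lulls: Lull[] = [];

  for (let i = 1; i < peakTimes.length; i++) {
    const start = peakTimes[i - 1];
    const end = peakTimes[i];

    if (end - start >= minDuration) {
      lulls.push({ start, end });
    }
  }

  // Keep the `limit` longest lulls, then restore chronological order.
  return lulls
    .sort((a, b) => (b.end - b.start) - (a.end - a.start))
    .slice(0, limit)
    .sort((a, b) => a.start - b.start);
}
```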

Track Hashing and Randomness

Instances of TrackAnalysis contain a trackHash field, a simple hash that attempts to uniquely identify a particular file. This hash is used by the getTrackRandomInt, getTrackSeededRandomInt, and getTrackSeededRandomFloat methods to produce "random" results that are ultimately predictable for a given track. The goal is for presentation elements to display in a randomized yet deterministic fashion, so that repeated visualizations of the same song display as consistently as possible.
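
As a sketch of this technique, a small string hash combined with a mulberry32-style generator yields deterministic per-track values; the repository's actual hash and random helpers may differ:

```ts
// Derive a stable 32-bit hash from file metadata. Hashing the name and size
// is an illustrative choice; the real trackHash may use other inputs.
function hashTrack(fileName: string, fileSize: number): number {
  let hash = fileSize;
  for (let i = 0; i < fileName.length; i++) {
    hash = ((hash << 5) - hash + fileName.charCodeAt(i)) | 0;
  }
  return hash >>> 0;
}

// A mulberry32-style one-shot PRNG: the same hash and salt always produce
// the same value in [0, 1), so visuals repeat consistently for a track.
function seededRandomFloat(trackHash: number, salt: number): number {
  let t = (trackHash + salt + 0x6d2b79f5) >>> 0;
  t = Math.imul(t ^ (t >>> 15), t | 1);
  t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
  return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
}

function seededRandomInt(trackHash: number, salt: number, maxExclusive: number): number {
  return Math.floor(seededRandomFloat(trackHash, salt) * maxExclusive);
}
```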