mifi opened 4 years ago
https://www.reddit.com/r/VideoEditing/comments/1uum3j/rule_on_cutting_to_a_beat/ https://www.reddit.com/r/musictheory/comments/be09qm/offbeats_backbeats_downbeats_upbeats_and_accented/el2o5y5
"Aesthetically speaking, it's actually better to cut on the downbeat (bass drum), as the edit is less noticeable to your audience."
Maybe the user API for this feature could look like this:
```js
editly({
  outPath: './customFabric.mp4',
  projectAudio: [
    { file: './coolbeats.mp3', alignClips: { beat: [1, 3], offset: 0, minDuration: 1000 }, effects: ['normalize'], volume: 0.9 },
  ],
  clips: [
    { layers: [{ type: 'fabric', func }] },
    { layers: [{ type: 'fabric', func }] },
    { duration: 2, layers: [{ type: 'fabric', func }] },
    { layers: [{ type: 'fabric', func }] },
    { layers: [{ type: 'fabric', func }] },
    { layers: [{ type: 'fabric', func }] },
    { layers: [{ type: 'fabric', func }] },
  ],
}).catch(console.error);
```
I think for now we'd only use the first alignClips definition, though it's possible to use multiple and merge the output of the beat-detector function.
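Merging could be as simple as concatenating the detectors' timestamp lists and collapsing near-duplicates. A minimal sketch (the `mergeBeats` helper and the tolerance value are my own assumptions, not anything in editly):

```javascript
// Merge beat timestamps (in seconds) from several detectors,
// collapsing beats that land within `tolerance` seconds of each other.
function mergeBeats(beatLists, tolerance = 0.05) {
  const all = beatLists.flat().sort((a, b) => a - b);
  const merged = [];
  for (const t of all) {
    if (merged.length === 0 || t - merged[merged.length - 1] > tolerance) {
      merged.push(t);
    }
  }
  return merged;
}
```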
Setting alignClips will run the main project audio through a beat detector to identify the distance between beats (the tempo), and then each cut will be snapped to the nearest beat. If a clip's duration is set explicitly, then the next clip after it will be extended to (the remainder + the normal time it would have had if the previous clip did not have an explicit duration set).
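The alignment step could be sketched like this (all names here, `snapToNearestBeat`, `alignClips`, `computedDuration`, are hypothetical, and the remainder handling is one possible reading of the rule above, not editly's actual implementation):

```javascript
// Pick the beat timestamp closest to time t.
function snapToNearestBeat(t, beats) {
  return beats.reduce((best, b) => (Math.abs(b - t) < Math.abs(best - t) ? b : best));
}

// Assign each clip a start time and computed duration, snapping cuts to beats.
// A clip with an explicit `duration` keeps it exactly; the leftover distance
// to the next beat is carried forward into the following clip's duration.
function alignClips(clips, beats, defaultDuration = 4) {
  let cursor = 0;
  let carry = 0; // remainder owed to the next clip
  return clips.map((clip) => {
    let end;
    if (clip.duration != null) {
      // explicit duration: honour it exactly, don't snap this cut
      end = cursor + clip.duration;
      const nextBeat = beats.find((b) => b >= end) ?? end;
      carry = nextBeat - end;
    } else {
      // nominal end = carried remainder + default duration, snapped to a beat
      end = snapToNearestBeat(cursor + carry + defaultDuration, beats);
      carry = 0;
    }
    const out = { ...clip, start: cursor, computedDuration: end - cursor };
    cursor = end;
    return out;
  });
}
```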
But if the song has an irregular beat, this could get complex...
I recommend Moby – Thousand as a benchmark for irregular (but not erratic) and rapid BPM.
Excerpt from Wikipedia:

> "Thousand" was listed in Guinness World Records for having the fastest tempo in beats-per-minute (BPM) of any released single, peaking at approximately 1,015 BPM.
The BPM rises slowly throughout the song towards the peak and then falls off pretty quickly. This song KICKS in Audiosurf 🙃
If Editly ever produces a good result for it, I'd say it would be pretty convincing.
https://github.com/astrofox-io/astrofox might be interesting, specifically the code in https://github.com/astrofox-io/astrofox/tree/master/src/audio, which uses https://www.npmjs.com/package/fourier-transform
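astrofox uses an FFT-based spectrum for its visualizations; for beat detection specifically, even a naive time-domain energy jump can serve as a first pass. A rough, dependency-free sketch (the function name, window size, and threshold are arbitrary assumptions of mine):

```javascript
// Naive energy-based onset detection on mono PCM samples.
// Returns approximate onset times in seconds: a window counts as an onset
// when its average energy jumps well above the previous window's.
function detectOnsets(samples, sampleRate, windowSize = 1024, threshold = 1.5) {
  const energies = [];
  for (let i = 0; i + windowSize <= samples.length; i += windowSize) {
    let e = 0;
    for (let j = i; j < i + windowSize; j++) e += samples[j] * samples[j];
    energies.push(e / windowSize);
  }
  const onsets = [];
  for (let k = 1; k < energies.length; k++) {
    if (energies[k] > threshold * energies[k - 1] && energies[k] > 1e-6) {
      onsets.push((k * windowSize) / sampleRate);
    }
  }
  return onsets;
}
```

A real detector would work on the spectrum (or per-band energies) instead of raw samples, which is where fourier-transform would come in.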
More music-analysis resources (google 'music analysis github'):

- https://nbviewer.jupyter.org/github/librosa/librosa/blob/master/examples/LibROSA%20demo.ipynb
- https://msaf.readthedocs.io/en/latest/features.html#features
- https://towardsdatascience.com/finding-choruses-in-songs-with-python-a925165f94a8
- https://github.com/aubio/aubio