magenta / magenta-js

Magenta.js: Music and Art Generation with Machine Learning in the browser
https://magenta.tensorflow.org
Apache License 2.0

Properties of notesequences #242

Closed · agreyfield91 closed this issue 5 years ago

agreyfield91 commented 5 years ago

Looking at the docs, I can't find a way to access the properties of a NoteSequence, such as the time signature or BPM. Is there a way to access these, or any other properties? Sorry if this is the wrong place to ask.

Thank you!

notwaldorf commented 5 years ago

I think the most complete/accurate list right now is in the protobuf itself: https://github.com/tensorflow/magenta-js/blob/master/music/src/protobuf/proto.d.ts
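
For anyone landing here later, here's a minimal sketch (assuming `@magenta/music` is installed) of building a `NoteSequence` by hand and reading the fields declared in that proto; the field names below follow `proto.d.ts`:

```ts
import * as mm from '@magenta/music';

// Build a small NoteSequence by hand; every field used here is declared
// in proto.d.ts. Sequences returned by models may leave most fields unset.
const seq = mm.NoteSequence.create({
  ticksPerQuarter: 220,
  totalTime: 2.0,
  tempos: [{time: 0, qpm: 120}],
  timeSignatures: [{time: 0, numerator: 4, denominator: 4}],
  notes: [
    {pitch: 60, startTime: 0.0, endTime: 0.5, velocity: 80},
    {pitch: 64, startTime: 0.5, endTime: 1.0, velocity: 80},
  ],
});

console.log(seq.tempos[0].qpm);                // 120
console.log(seq.timeSignatures[0].numerator);  // 4
console.log(seq.totalTime);                    // 2
```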

agreyfield91 commented 5 years ago

Thank you very much! I tried experimenting with the properties of a NoteSequence created from OnsetsAndFrames() audio transcription, but only .totalTime returned a value; everything else was either unset or an empty array. Are there currently any methods/models to "estimate" some of these properties from a transcribed NoteSequence?

notwaldorf commented 5 years ago

What kind of properties are you looking for? I think onsets and frames will return a non-quantized sequence, so it should have a startTime/endTime for every note. If you need to quantize it, there’s a helper for that too.
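
As a sketch of that helper (the quantizer lives in `mm.sequences`; `transcribed` below stands in for whatever `OnsetsAndFrames` returned):

```ts
import * as mm from '@magenta/music';

// `transcribed` stands in for the unquantized NoteSequence returned by
// OnsetsAndFrames; typically only notes[].startTime/endTime and
// totalTime are populated.
declare const transcribed: mm.INoteSequence;

// Snap onsets/offsets to a grid of 4 steps per quarter note. If the
// sequence carries no tempo, the helper assumes a default qpm.
const quantized = mm.sequences.quantizeNoteSequence(transcribed, 4);

console.log(quantized.quantizationInfo?.stepsPerQuarter);  // 4
console.log(quantized.notes[0].quantizedStartStep);
console.log(quantized.notes[0].quantizedEndStep);
```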

agreyfield91 commented 5 years ago

I would ideally be able to get rhythm information such as qpm or time signature from the transcribed sequence. Would quantizing the sequence allow me to access more information?

adarob commented 5 years ago

We don't actually infer qpm or time signature with Onsets and Frames, since those values aren't known in our ground-truth data.

adarob commented 5 years ago

There are some off-the-shelf methods for estimating tempo, however. For example, librosa's: https://librosa.github.io/librosa/generated/librosa.beat.tempo.html
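
librosa is Python-only, so it won't run in the browser. As a rough in-browser stand-in (a crude heuristic of my own, not librosa's beat tracker), one could treat the median gap between successive note onsets in the transcription as one beat:

```ts
import * as mm from '@magenta/music';

// Crude heuristic, not librosa's algorithm: take the median
// inter-onset interval as one beat and convert it to BPM.
function roughQpmEstimate(ns: mm.NoteSequence): number {
  const onsets = ns.notes
      .map(n => n.startTime || 0)
      .sort((a, b) => a - b);
  const gaps: number[] = [];
  for (let i = 1; i < onsets.length; i++) {
    const gap = onsets[i] - onsets[i - 1];
    if (gap > 0.05) {  // skip near-simultaneous onsets (chords)
      gaps.push(gap);
    }
  }
  if (gaps.length === 0) {
    return 120;  // nothing to measure; fall back to a common default
  }
  gaps.sort((a, b) => a - b);
  const medianGap = gaps[Math.floor(gaps.length / 2)];
  return 60 / medianGap;  // seconds per beat -> beats per minute
}
```

This will be off by an octave (half or double the true tempo) on many inputs, which is a known ambiguity in tempo estimation generally; for anything serious, run the transcription through a proper beat tracker server-side.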