zkmkarlsruhe / ofxTensorFlow2

TensorFlow 2 AI/ML library wrapper for openFrameworks

music generation example #28

Open Jonathhhan opened 1 year ago

Jonathhhan commented 1 year ago

I tried to use this model for music generation: https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/audio/music_generation.ipynb#scrollTo=1mil8ZyJNe1w

It's still very rough, and I'm not sure everything works as expected. Maybe you can find something wrong... especially my replacement for tf.random.categorical(pitch_logits, num_samples=1) could be wrong. The model is very small, so it is included in the example: https://github.com/Jonathhhan/ofxTensorFlow2/tree/music_generation_example/example_music_generation
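
For reference, a minimal sketch of one way to replicate tf.random.categorical(pitch_logits, num_samples=1) on the C++ side, assuming the pitch logits have already been copied into a std::vector<float>; the function name is illustrative and not part of the example code:

```cpp
#include <vector>
#include <random>
#include <algorithm>
#include <cmath>

// Draw one index from unnormalized logits, matching the distribution of
// tf.random.categorical(logits, num_samples=1) for a single row.
int sampleCategorical(const std::vector<float>& logits, std::mt19937& rng) {
    // numerically stable softmax weights: exp(logit - max)
    float maxLogit = *std::max_element(logits.begin(), logits.end());
    std::vector<double> weights(logits.size());
    for (size_t i = 0; i < logits.size(); ++i) {
        weights[i] = std::exp(logits[i] - maxLogit);
    }
    // discrete_distribution normalizes the weights internally
    std::discrete_distribution<int> dist(weights.begin(), weights.end());
    return dist(rng);
}
```

Sampling from the softmax instead of taking the argmax keeps the output from collapsing to the most likely pitch at every step.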

Jonathhhan commented 1 year ago

I guess it works now...

danomatika commented 1 year ago

Without trying it, I would say we could connect it to a synth with ofxMidi or generate audio directly with ofxPd.

Jonathhhan commented 1 year ago

@danomatika Yeah, I already connected it to MIDI out with ofxMidi. ofxPd would be nice, too. Often it sounds quite random, but I can also hear some musical structure. I have to play a bit with the training settings...
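
For anyone following along, sending the predicted notes out with ofxMidi could look roughly like this; a sketch only, assuming an ofxMidiOut member called midiOut and pitch/velocity already mapped to 0-127, not the exact code in the example:

```cpp
#include "ofxMidi.h"

// in ofApp.h (sketch): ofxMidiOut midiOut;

void ofApp::setup() {
    midiOut.listOutPorts(); // print the available MIDI output ports
    midiOut.openPort(0);    // open the first port; adjust to your setup
}

// hypothetical helper: send one predicted note on channel 1
void ofApp::playNote(int pitch, int velocity) {
    midiOut.sendNoteOn(1, pitch, velocity);
}

void ofApp::exit() {
    midiOut.closePort();
}
```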

danomatika commented 1 year ago

Looks like you need to track the duration for each pitch and send the noteoff for each one individually. That might give you better rhythm and structure in the output.
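
A sketch of what that bookkeeping could look like, assuming the model provides a duration in seconds per note and reusing the hypothetical midiOut from above; the ActiveNote struct and helper names are illustrative, not from the example:

```cpp
#include "ofMain.h"
#include "ofxMidi.h"

// one entry per sounding note, so its noteoff can be sent when it is due
struct ActiveNote {
    int pitch;
    float offTime; // absolute time in seconds when the noteoff is due
};

std::vector<ActiveNote> activeNotes;

void startNote(ofxMidiOut& midiOut, int pitch, int velocity, float duration) {
    midiOut.sendNoteOn(1, pitch, velocity);
    activeNotes.push_back({pitch, ofGetElapsedTimef() + duration});
}

// call every frame, e.g. from ofApp::update()
void updateNotes(ofxMidiOut& midiOut) {
    float now = ofGetElapsedTimef();
    for (auto it = activeNotes.begin(); it != activeNotes.end();) {
        if (now >= it->offTime) {
            midiOut.sendNoteOff(1, it->pitch, 0);
            it = activeNotes.erase(it);
        } else {
            ++it;
        }
    }
}
```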

Jonathhhan commented 1 year ago

It is interesting that the output seems to gain more structure after running for some time. But maybe that is because I choose a more or less random sequence for the beginning...

Jonathhhan commented 1 year ago

I added velocity to the model (and added the edited notebook to the example): https://github.com/Jonathhhan/ofxTensorFlow2/tree/music_generation_example/example_music_generation_2 I also wonder how to implement the music transformer: https://colab.research.google.com/notebooks/magenta/piano_transformer/piano_transformer.ipynb
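
If the velocity output is a float, one way to make it safe for MIDI is to round and clamp it before sending; a tiny hypothetical helper, assuming the prediction is already roughly in MIDI range:

```cpp
#include <algorithm>
#include <cmath>

// clamp a predicted velocity to the valid MIDI range 1-127
// (0 is avoided because a velocity-0 noteon is treated as a noteoff)
int toMidiVelocity(float predictedVelocity) {
    int v = static_cast<int>(std::round(predictedVelocity));
    return std::max(1, std::min(127, v));
}
```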