Closed stevenwaterman closed 4 years ago
Hello, first, thank you for an awesome tool! It helped make music generation much easier for me. I am able to load custom Piano MIDI using this current workaround:
1. Go to the official client and load in a MIDI file, logging the network result.
2. Generate a completion and get a save file.
3. Let `save` be the savestate file and `src` be the JSON from MuseNet's official client, then apply this JavaScript:
```js
save.children[1].track.encoding = src.completions[0].encoding
save.children[1].track.endsAt = src.completions[0].totalTime
save.children[1].track.notes = { bass: [], drums: [], guitar: [], harp: [], piano: src.completions[0].tracks[0].notes, strings: [], winds: [] }
save.children[1].track.audio = 'data:audio/mp3;base64,' + src.completions[0].audioFile.slice(2, -1)
```
The patched savestate can be loaded into MuseTree.
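The patch above can also be wrapped in a small helper. This is a minimal sketch: the field names (`children`, `track`, `completions`, `audioFile`, etc.) come straight from the snippet, but the function name `patchSave` and the comment about the `slice(2, -1)` are my own assumptions, not part of MuseTree or the official client.

```javascript
// Hypothetical helper applying the savestate patch described above.
// `save` is a MuseTree savestate object; `src` is the JSON response
// captured from the official MuseNet client's network tab.
function patchSave(save, src) {
  const completion = src.completions[0];
  const track = save.children[1].track;

  track.encoding = completion.encoding;
  track.endsAt = completion.totalTime;
  track.notes = {
    bass: [], drums: [], guitar: [], harp: [],
    piano: completion.tracks[0].notes,
    strings: [], winds: []
  };
  // slice(2, -1) strips a 2-character prefix and 1-character suffix,
  // presumably a bytes-literal wrapper like b'...' around the base64 data.
  track.audio = 'data:audio/mp3;base64,' + completion.audioFile.slice(2, -1);
  return save;
}
```

The helper mutates `save` in place and returns it, so the result can be serialised back to a file and loaded into MuseTree as before.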
This will require converting the MIDI to a MuseNet encoding.
@MrCheeze made a tool to convert between MIDI and MuseNet encoding here: https://github.com/MrCheeze/musenet-midi
That is amazing and makes this much simpler to implement! I'll do my best to look at this in the next couple of days and update this thread as I'm going. Thanks for all your help!
As an aside, it's great to see people actually getting stuck in and using MuseTree - I'd be fascinated to hear more about your experience with it, and encourage you to open issues for any problems you find!
Thanks! I'm starting a YouTube channel and thought it'd be great to have unique background music for each video, but with the same theme, so I created a short MIDI clip and let MuseNet improvise from there. That's my main use case for custom MIDI.
Here’s an example: https://youtu.be/EP24h7g7UmE
Only roughly 5–15% of the generations sounded great to me (the others sound repetitive or don't fit my taste), so it's great to have access to each of the outputs.
So now my tree tends to look like this:
First, I let MuseNet generate 16 ideas (length=540). This sets the theme for the rest of the song. I prefer a longer length because this tends to give more continuity in the piece (with shorter length, continuation of the initial completion tends to sound repetitive).
I pick a few interesting ones and expand on the idea, using a smaller length (320) to give me more fine-grained control. After a few chains, the music ends and I export a MIDI file.
That's really interesting - and amazing to see it in use! I'm working on this issue now, starting by editing musenet-midi to work with musetree. I'll keep this thread updated with the progress :slightly_smiling_face:
Best of luck, let me know if anything's unclear in there... Note that OpenAI themselves have their own MIDI conversions that work in a fairly different way from mine; no idea if the differences matter in practice.
Just to keep this thread updated, I have finished the rework of musenet-midi and started looking into integrating it with MuseTree. However, that required me to first work on #1 which meant a major rework of the Svelte Stores. That code was a mess so I'm going through it and converting to TypeScript and tidying up while refactoring.
Basically, it's a big job. I haven't forgotten about it and I am working on it, but it might be a week or more until you hear from me.
Not much of an update, I'm afraid! I'm still working on this any chance I get, and it's coming along nicely! It has ballooned into a huge update, but it lays the groundwork for future features like a built-in MIDI editor. You can keep an eye on my work in the `partial-typescript` branch.
I've had to make two main changes. The first is done, and the second is working, but it doesn't sound great. On the plus side, I have conversions to and from MuseNet encoding working, so adding this feature should be relatively simple once all the infrastructure changes are done!
This is now implemented. You can use the `import` button to select a MIDI file (or manually specify an encoding).
I am closing this ticket, but feel free to open a new ticket for any issues you discover!
Thanks for your patience - it's been a long journey getting this fixed
Relies on https://github.com/stevenwaterman/musetree/issues/1
Related to https://github.com/stevenwaterman/musetree/issues/7
Allows you to load a custom MIDI into the app, the same as with the starter samples. This will require converting the MIDI to a MuseNet encoding.