
MuseMorphose

This repository contains the official implementation of the following paper:

Shih-Lun Wu and Yi-Hsuan Yang, "MuseMorphose: Full-Song and Fine-Grained Piano Music Style Transfer with One Transformer VAE," IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2023.

Prerequisites
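
The dependencies to install are those shipped with the repository; as a rough sketch only (assuming a requirements.txt is provided, adjust to the repo's actual instructions), environment setup might look like this:

# create and activate a virtual environment (optional)
python3 -m venv venv
source venv/bin/activate

# install the pinned dependencies
pip install -r requirements.txt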

Preprocessing

# download REMI-pop-1.7K dataset
wget -O remi_dataset.tar.gz https://zenodo.org/record/4782721/files/remi_dataset.tar.gz?download=1
tar xzvf remi_dataset.tar.gz
rm remi_dataset.tar.gz

# compute the attribute classes
python3 attributes.py
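
For reference, the conditioning attributes are bar-level rhythmic intensity and polyphonicity, each discretized into ordinal classes. The actual computation lives in attributes.py; the sketch below only illustrates the binning idea, with an assumed class count of 8 and quantile-based class boundaries:

import numpy as np

def to_attr_classes(raw_values, n_classes=8):
    # illustrative only: bin raw per-bar attribute values (e.g. note density)
    # into ordinal classes using dataset-wide quantile boundaries
    bounds = np.quantile(raw_values, np.linspace(0, 1, n_classes + 1)[1:-1])
    return np.digitize(raw_values, bounds)  # class indices in 0 .. n_classes - 1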

Training

python3 train.py [config file]
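
For example, assuming the default config lives at config/default.yaml (path illustrative; point this at the config file actually shipped with the repo):

python3 train.py config/default.yaml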

Generation

python3 generate.py [config file] [ckpt path] [output dir] [num pieces] [num samples per piece]
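
For example, to draw 10 pieces from the test set and generate 5 style-transferred samples per piece (config and checkpoint paths are illustrative):

python3 generate.py config/default.yaml ckpt/best_params.pt generations/ 10 5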

This script randomly draws the specified number of pieces from the test set.
For each sample of a piece, the rhythmic intensity and polyphonicity attribute classes are shifted uniformly across the entire piece by a random offset in [-3, 3], prompting the model to generate style-transferred music.
You may modify random_shift_attr_cls() in generate.py, or write your own function to set the attributes; a sketch of such a function follows below.
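
A minimal sketch of such a function, assuming the per-bar attribute classes of a piece are plain integer lists and that each attribute has 8 ordinal classes (the function name and signature here are illustrative, not the repository's API):

import random

def shift_attr_cls_uniform(bar_classes, shift_range=3, n_classes=8):
    # pick one random offset and apply it to every bar's attribute class,
    # clamping the result to the valid range [0, n_classes - 1]
    offset = random.randint(-shift_range, shift_range)
    return [min(max(c + offset, 0), n_classes - 1) for c in bar_classes]

Replacing the random offset with a fixed value turns this into a deterministic style shift, e.g. making every generated sample strictly more polyphonic than the original.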

Customized Generation (To Be Added)

We welcome the community's suggestions and contributions for an interface on which users may freely set the bar-level attributes for generation.

Citation (BibTeX)

If you find this work helpful and use our code in your research, please cite our paper:

@article{wu2023musemorphose,
    title={{MuseMorphose}: Full-Song and Fine-Grained Piano Music Style Transfer with One {Transformer VAE}},
    author={Shih-Lun Wu and Yi-Hsuan Yang},
    year={2023},
    journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
}