First, as mentioned in the paper, drums are not considered melodic instruments and therefore do not produce chords.
Second, typically in MIDI processing the pitch of a drum note lies in the range (128, 256), while the pitch of a melodic instrument lies in (0, 127). I have never tried changing a drum program number to a piano, and I cannot imagine what would happen. You can check how the music sounds after your modification.
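To make that pitch-range convention concrete, here is a minimal illustrative sketch; the function name and the per-note +128 offset are my own shorthand for the idea, not the repo's actual encoding code.

```python
# Illustrative sketch (not the repo's code): melodic pitches stay in 0-127,
# while drum "pitches" are shifted by 128 so the two vocabularies never overlap.
def encode_pitch(midi_pitch: int, is_drum: bool) -> int:
    if not 0 <= midi_pitch <= 127:
        raise ValueError("raw MIDI pitch must be in 0-127")
    return midi_pitch + 128 if is_drum else midi_pitch

# A kick drum (MIDI pitch 36) on a drum track maps to 164,
# while middle C (60) on a piano track stays at 60.
assert encode_pitch(36, is_drum=True) == 164
assert encode_pitch(60, is_drum=False) == 60
```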
Additionally, if you simply relabel a drum track as a piano track, there will be a significant gap between the input and what the model expects: it may become confused, since it expects piano notes rather than a percussion pattern.
It is highly recommended to review the code in track_generation.py to gain a comprehensive understanding of the entire processing pipeline, as this will undoubtedly be helpful.
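If you want to confirm how each track is actually flagged inside your MIDI file, a quick check is to print every track's program number and drum flag. Below is a minimal sketch using pretty_midi (just one convenient library for this check, not something the repo requires; the file path is a placeholder):

```python
import pretty_midi

# Print every track's name, program number, and drum flag.
# In General MIDI, percussion lives on channel 10, which pretty_midi
# exposes as the is_drum attribute of each instrument.
pm = pretty_midi.PrettyMIDI("example_data/inference/1.mid")  # placeholder path
for inst in pm.instruments:
    label = "Drum kit" if inst.is_drum else pretty_midi.program_to_instrument_name(inst.program)
    print(f"name={inst.name!r:<20} program={inst.program:3d} is_drum={inst.is_drum} ({label})")
```

If a track you expect to be drums shows is_drum=False, it was exported on a normal (non-percussion) channel and will not be read as a drum track; re-exporting it on the percussion channel is worth trying before running track_generation.py again.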
I have replaced the instrument with a drum set (D. Set) in MuseScore, but I still get a chord error. I want to use drums as the condition track to generate a new MIDI file, and I expect to get the same drum beat back. The log is below. When I change the drum set to a piano and use 'p'---->'dlg' instead, python track_generation.py works. Is this due to a wrong drum set ID in the MIDI file? Is there a simple way to check whether the MIDI program ID is set correctly?
(CVBR) C:\Users\Mocopi\Downloads\muzic-main\getmusic>python track_generation.py --load_path ./checkpoint.pth --file_path example_data/inference
Global seed set to 0
<class 'getmusic.modeling.models.dfm.DFM'>
<class 'getmusic.modeling.roformer.diffusion_roformer.DiffusionRFM'>
<class 'getmusic.modeling.roformer.roformer_utils.DiffusionRoformerModel'>
Load dictionary: 11879 tokens.
<class 'getmusic.engine.clip_grad_norm.ClipGradNorm'>
Get lr 3e-06 from base lr 3e-06 with none
<class 'torch.optim.adamw.AdamW'>
<class 'getmusic.engine.lr_scheduler.LinearDecayLRWithWarmup'>
{'overall': {'trainable': '86.49M', 'non_trainable': '96.0K', 'total': '86.59M'}, 'rfm': {'trainable': '86.49M', 'non_trainable': '96.0K', 'total': '86.59M'}}
self.device 0
inference_cache:
global rank 0: prepare solver done!
Resume from ./checkpoint.pth
example_data/inference\1.mid skip?
Select condition tracks ('b' for bass, 'd' for drums, 'g' for guitar, 'l' for lead, 'p' for piano, 's' for strings, 'c' for chords; multiple choices; input any other key to skip):d
Select content tracks ('b' for bass, 'd' for drums, 'g' for guitar, 'l' for lead, 'p' for piano, 's' for strings; multiple choices):l
chord error
(CVBR) C:\Users\Mocopi\Downloads\muzic-main\getmusic>python track_generation.py --load_path ./checkpoint.pth --file_path example_data/inference
Global seed set to 0
<class 'getmusic.modeling.models.dfm.DFM'>
<class 'getmusic.modeling.roformer.diffusion_roformer.DiffusionRFM'>
<class 'getmusic.modeling.roformer.roformer_utils.DiffusionRoformerModel'>
Load dictionary: 11879 tokens.
<class 'getmusic.engine.clip_grad_norm.ClipGradNorm'>
Get lr 3e-06 from base lr 3e-06 with none
<class 'torch.optim.adamw.AdamW'>
<class 'getmusic.engine.lr_scheduler.LinearDecayLRWithWarmup'>
{'overall': {'trainable': '86.49M', 'non_trainable': '96.0K', 'total': '86.59M'}, 'rfm': {'trainable': '86.49M', 'non_trainable': '96.0K', 'total': '86.59M'}}
self.device 0
inference_cache:
global rank 0: prepare solver done!
Resume from ./checkpoint.pth
example_data/inference\1.mid skip?
Select condition tracks ('b' for bass, 'd' for drums, 'g' for guitar, 'l' for lead, 'p' for piano, 's' for strings, 'c' for chords; multiple choices; input any other key to skip):
Select content tracks ('b' for bass, 'd' for drums, 'g' for guitar, 'l' for lead, 'p' for piano, 's' for strings; multiple choices):l
chord error
(CVBR) C:\Users\Mocopi\Downloads\muzic-main\getmusic>