Closed: quasilyte closed this issue 1 year ago
I want to use this package in my Ebitengine game. Right now I can't figure out how to adjust the channel/instrument volume level. Is it possible? I tried the MasterVolume option, but it doesn't look like it's affecting anything.
Allow me to clarify a few points:
Do you want to change the volume of each channel when using the Synthesizer object? Or, are you looking to change the volume of each channel individually when playing a MIDI file?
You mentioned that MasterVolume has no effect. Do you mean that changing this value didn't cause any changes to the overall volume?
(Please keep in mind that I don't know MIDI that well.)
> Do you want to change the volume of each channel when using the Synthesizer object? Or, are you looking to change the volume of each channel individually when playing a MIDI file?
I'm using only a Synthesizer type right now since I want to generate the music. So I'm not sure if that's related (there is no MIDI file, I guess?). Or maybe I should create a MIDI file on the fly?
I looked at the MIDI player code and used almost the same approach: triggering the notes when they need to be played while rendering the left & right blocks along the way.
> You mentioned that MasterVolume has no effect. Do you mean that changing this value didn't cause any changes to the overall volume?
I tried setting it to 1.0 (0.5 is the default, right?) as well as to something like 0.05, but there was no difference in the sound. Below some threshold it does stop playing anything, though (it goes silent).
We can consider the game I'm creating to be a synthesizer game. You put some notes on the screen, press "play" and then you want to hear the result.
My current issues are that I can't find a way to adjust the per-channel (instrument) volume, and that MasterVolume doesn't seem to have any effect.
If you are directly using the Synthesizer type, the easiest way to change the volume of a note is to set a small value for the velocity argument in the NoteOn function. Velocity can take a value between 1 and 127, with smaller values resulting in softer sounds.
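For example, here is a minimal sketch (error handling omitted, file name illustrative) that plays the same key at two different velocities, assuming the usual setup from the repository's examples:

```go
package main

import (
	"os"

	"github.com/sinshu/go-meltysynth/meltysynth"
)

func main() {
	// Load a SoundFont and create a synthesizer (error handling omitted).
	sf2, _ := os.Open("TimGM6mb.sf2")
	soundFont, _ := meltysynth.NewSoundFont(sf2)
	sf2.Close()

	settings := meltysynth.NewSynthesizerSettings(44100)
	synthesizer, _ := meltysynth.NewSynthesizer(soundFont, settings)

	// Same key, different velocities: 100 is loud, 20 is much softer.
	synthesizer.NoteOn(0, 60, 100)
	synthesizer.NoteOn(1, 60, 20)

	// Render 3 seconds of stereo audio.
	left := make([]float32, 3*settings.SampleRate)
	right := make([]float32, 3*settings.SampleRate)
	synthesizer.Render(left, right)
}
```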
A single synthesizer can only handle one SoundFont. However, a single SoundFont can contain multiple instruments. For example, TimGM6mb.sf2 contains over 100 different instruments. To change the instrument assignment for each channel, please refer to the example code in C#.
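In Go, the corresponding step is sending a MIDI Program Change through ProcessMidiMessage. A hedged sketch (the program numbers depend on the loaded SoundFont; the ones below assume a General MIDI bank such as TimGM6mb.sf2):

```go
// 0xC0 is the MIDI Program Change command; data1 selects the instrument
// (program) for the channel, data2 is unused for this message.
channel := int32(1)
program := int32(40) // 40 = Violin in the General MIDI numbering
synthesizer.ProcessMidiMessage(channel, 0xC0, program, 0)

// Notes played on this channel now use the selected instrument.
synthesizer.NoteOn(channel, 60, 100)
```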
It's unusual that changing the MasterVolume doesn't have any effect. Are you performing any process to normalize the volume after rendering the waveform?
I'll use the advice you gave and dig up the ProcessMidiMessage function. :)
(Also, thank you a lot for responding! :heart:)
> It's unusual that changing the MasterVolume doesn't have any effect. Are you performing any process to normalize the volume after rendering the waveform?
It's the same with the chord example from this repository. If I set the master volume to 0.01, the output sounds the same.
```sh
go run . --sf2 *.sf2 -midi chord
ffmpeg -f s16le -ar 44.1k -ac 2 -i out.pcm file.wav
```
Maybe the sf2 file is to blame?
In the code example in main.go, the volume of the rendered waveform is normalized. Specifically, the process ensures that the maximum value of the rendered waveform matches 99% of the maximum value of 16-bit PCM.
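(For context, here is a rough sketch of what such a peak-normalization step looks like; the names are illustrative, not the exact code from main.go. Because the gain is derived from the peak of the rendered signal itself, any earlier volume change, including MasterVolume, is cancelled out at this step.)

```go
// Illustrative sketch: peak-normalize the rendered stereo buffers
// to ~99% of the 16-bit PCM range and interleave them.
func normalizeToPCM(left, right []float32) []int16 {
	abs := func(v float32) float32 {
		if v < 0 {
			return -v
		}
		return v
	}

	// Find the absolute peak across both channels.
	peak := float32(0)
	for i := range left {
		if v := abs(left[i]); v > peak {
			peak = v
		}
		if v := abs(right[i]); v > peak {
			peak = v
		}
	}

	// Scale so the peak lands at 99% of full scale.
	// This is why MasterVolume appears to do nothing: whatever gain was
	// applied during rendering is undone by this rescaling.
	gain := float32(32768) * 0.99 / peak
	out := make([]int16, 2*len(left))
	for i := range left {
		out[2*i] = int16(gain * left[i])
		out[2*i+1] = int16(gain * right[i])
	}
	return out
}
```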
Ah, gotcha. I just copy/pasted the code without knowing what exactly it was doing. I assumed it just encoded the data to PCM. :sweat_smile:
I think this issue can be closed now. 🙌
The answer to my question was found in your ProcessMidiMessage() suggestion.
So, in order to change the channel volume, it looks like we can do this:
```go
// 0xB0 is the MIDI Control Change (CC) command.
// 0x07 is controller 7, "Channel Volume" (coarse):
// 0 means silent, 127 is the maximum.
channel := int32(0)
volume := int32(127)
synthesizer.ProcessMidiMessage(channel, 0xB0, 0x07, volume)
```
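For what it's worth, controller 7 is the standard MIDI "Channel Volume" controller, so it can be set independently for each of the 16 channels (0–15), and it combines with the per-note velocity passed to NoteOn.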
Note for future readers: you need to remove the volume normalization from the example's code, as described above by sinshu. Otherwise, you'll get roughly the same volume level regardless of this setting.
I like this library a lot. :D