Closed Ranoller closed 6 years ago
I have only just started to do some serious audio work in Godot 3 for the first time this week. As you note in issue 4, I was also surprised to find that there is no visual feedback on the bus meters. I thought I was doing something wrong, as I had expected to see feedback while the game was previewing, à la DAW metering.
As for Issue 1, do you mean sending to buses in the left channel or visually to the left (as in the order they are laid out)? I was doing several sends and had no problem. I was sending several individual buses to 2 main buses ("sound effects" and "music") and then sending those to the master with no issue.
A few other general audio observations...
There is no simple way to preview a sample, which is a strange omission. There needs to be a preview/play button on the sample player and in the file system browser. You can toggle the "Playing" checkbox on the sample player, but this doesn't work a lot of the time, even for the same sample used on different sample players.
There is no way to adjust pitch in code or the GUI as far as I can tell, though I have been using the random pitch functionality, which is useful.
There is a great need for a custom attenuation curve on the AudioSamplePlayer3D. I had a lot of trouble trying to balance near/far sounds and had to resort to individual buses for problematic samples in order to use heavy bus compression (not actually a bad workaround, but not ideal). I haven't touched Unity in a long while, but I had a look at their audio documentation and really like how attenuation and other parameters can be mapped to a customisable curve! By the way, what does Unit Size in Godot even do for attenuation? I couldn't get much joy from playing with it. Also, I'm not sure why Max dB can go louder than Unit dB. Doesn't the attenuation fall off from Unit dB, so is Max dB overriding it?
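To make the question concrete, here is a toy sketch of how those three parameters could interact in an inverse-distance falloff. This is my own assumption from the property names, not Godot's actual implementation: I treat Unit dB as the volume at Unit Size metres and Max dB as a ceiling the attenuated level is clamped to.

```python
import math

def attenuation_db(distance, unit_db=0.0, unit_size=1.0, max_db=3.0):
    """Inverse-distance attenuation sketch: unit_db is the volume at
    unit_size metres; the level falls off by ~6 dB per doubling of
    distance, and is clamped so it never exceeds max_db."""
    d = max(distance, 1e-6)  # avoid log of zero at the emitter
    db = unit_db - 20.0 * math.log10(d / unit_size)
    return min(db, max_db)

print(attenuation_db(1.0))            # at unit_size: unit_db (0.0)
print(round(attenuation_db(2.0), 1))  # one doubling: -6.0
```

Under this reading, Max dB only matters closer than Unit Size (it stops the sound getting arbitrarily loud at the emitter), which would explain why it can be set higher than Unit dB.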
I am hoping that someone will eventually be able to add a convolution reverb, as you really need to be able to simulate realistic spaces with an impulse to immerse the player. Not sure what the license is on the old "freeverb" plugin code, but I know it was used in a lot of freeware. It could make a good replacement for the existing reverb as a basic synthetic reverb, but if the performance of a convolution reverb is good enough, I'm not sure anything else is really needed.
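The idea itself is conceptually simple; the cost is the hard part. A minimal sketch of direct convolution with an impulse response is below (real engines use partitioned FFT convolution to keep CPU usage and latency down; this naive loop is O(N·M) and only for illustration):

```python
def convolve(dry, impulse):
    """Naive direct convolution of a dry signal with an impulse
    response. Each input sample triggers a scaled copy of the
    impulse response, summed into the output."""
    out = [0.0] * (len(dry) + len(impulse) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse):
            out[i + j] += x * h
    return out

# A unit impulse through a two-tap "room" reproduces the taps.
print(convolve([1.0], [0.5, 0.25]))  # [0.5, 0.25]
```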
When selecting a bus to sidechain in the compressor, the bus selection doesn't stick and goes blank. Hard to say whether this is just a GUI error.
Bus names are truncated in the audio view, so it would be good to have a way to view the full name. The bus strips are quite narrow, so it wouldn't hurt to make them wider or manually resizable.
In general though, the audio side is working pretty well in game, so it is a solid base to work with :)
By sending to the left, I mean that it is not possible to send bus 4 to bus 5.
Note: you can rearrange the buses by dragging now, but the grab spot is a bit tricky to hit.
A convolution reverb would be a great improvement, but there are very good algorithmic reverbs too. The problem I see is that in gameplay CPU cycles are more valuable than memory, so a real-time algorithmic reverb seems like a worse approach than convolution... But I don't think it will be a priority. A real-time mixer is the improvement needed before you can call a mixer a "mixer".
Aha, you are right, you can drag buses... I was trying to drag from the wrong spot! Will strike that from my list :)
So you mean sending from a bus on the left to a bus on the right of it. Haven't tried it, sounds like it is a bug if that doesn't work. I have only been going the other way around.
A good algorithmic reverb is definitely a good thing to have; it depends on your game, of course. That is why I was suggesting using the freeverb code, as it is quite decent. Not sure of the license, though. Personally I would only use an algorithmic reverb on music (I love the Valhalla ones, but getting an open-source one on par with those might be a hard ask!) and would prefer convolution for a 3D game... but choices are always good!
In the end, it is a matter of an interested coder having the experience and time to donate. DSP programming is very specialised, and as far as I know, there are not a lot of experienced coders doing this kind of work in the open source world. Getting a convolution reverb happening would be a great GSoC idea if any of them are reading this... maybe @akien-mga could add it to the list if it isn't too late :)
In total agreement with the realtime mixer metering, it is a priority.
Reduz is a musician too, so audio will probably be a priority (we want the MOD files back, man! Chiptunes are crying in their room). In the past I played with creating VSTs (SynthEdit and others; I tried ReaScript too, very affordable but only for Reaper). The C++ part was very intimidating to me, but reading the Godot code I learned a lot... Maybe in the future I'll feel confident enough to start a module to add some kind of plugin... if that doesn't distract from game design. Sometimes I feel that with Godot I do all kinds of things (learn C++, catch bugs, translate, do tests...) but no designing of my game... :( ;) There is no time for everything!!
@Ranoller yeah, I am also a Reaper user :) It is amazing how all the js Reaper plugins have their code so easily accessible...an amazing learning resource for anyone wanting to code audio plugins. Like you, my plate is very full but I would love to learn some basic DSP audio coding one day.
I am hoping that something like FluidSynth will eventually be added as a module in Godot. It supports SoundFonts and MIDI and has been used in a ton of open source projects. This would be more useful than MOD support for me, as I have never gotten along with trackers :) I released a little Android game several years ago made with the FOSS ZGameEditor, which has a basic little 2-oscillator synth built in. It was fun to just import MIDI files and use that for all the music and SFX in the game without worrying about any samples at all. Would be cool to have something like that (but of higher quality) as a module for Godot!
The send control in the mixer only allows you to send audio to buses to the left of the selected bus. This seems like a bug.
Pretty sure this was done on purpose to avoid loops.
Issue 1: The send control in the mixer only allows you to send audio to buses to the left of the selected bus. This seems like a bug. (Is it a bug?)
I think this is expected behavior. I'm not sure how it was implemented, but it is due to the index of the buses: you can expect, for instance, that two different buses send their data to the Master, but not the other way around, since Master would receive data from itself, then send, then receive, then send... And this is true for any other bus. Any arrangement where a bus can send data to buses with an index greater than its own would create incoherent mixing (like PlayerSFX and InterfaceSFX sending data to SFX, while SFX can send data to PlayerSFX, which sends data back to SFX).
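The argument above can be sketched in a few lines (hypothetical data, not the Godot API): if every send must target a lower-indexed bus, the routing is a DAG by construction and can be mixed in a single right-to-left pass, with no feedback loop possible.

```python
def mix_order_valid(sends):
    """sends[i] is the index of the bus that bus i sends to
    (None for Master, which sends nowhere). Requiring every send
    to target a strictly lower index rules out cycles, because a
    loop would need at least one send pointing 'rightwards'."""
    return all(target < i for i, target in enumerate(sends)
               if target is not None)

# Master(0) <- SFX(1) <- PlayerSFX(2), InterfaceSFX(3): valid.
print(mix_order_valid([None, 0, 1, 1]))  # True
# Bus 0 -> 2 -> 1 -> 0 would loop: rejected.
print(mix_order_valid([2, 0, 1]))        # False
```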
There is no simple way to preview a sample, which is a strange omission. There needs to be a preview/play button on the sample player and in the file system browser. You can toggle on/off the "Playing" checkbox in the audiosampleplayer but this doesn't work a lot of the time, even for the same sample used on different sample players.
For some reason they removed the SampleLibrary class, which was an awesome implementation in 2.x: it let you preview samples and keep them all in one container, so you could add that container to a SamplePlayer and then pass a string, just like we do with AnimationPlayer, to tell it which sample to play. That would solve a lot of the problems I see in this issue. It would also ease the current task of mapping samples to an array or a dictionary to change them dynamically. With the SampleLibrary implementation, we just needed to call SamplePlayer.play(String sample).
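The pattern being described can be sketched language-agnostically (Python here; the class shape, names, and paths are hypothetical and this is not the actual 2.x API): a named container that players can address by string instead of holding direct sample references.

```python
class SampleLibrary:
    """Sketch of the described pattern: register streams under
    names, then play by string, like AnimationPlayer does with
    animation names."""
    def __init__(self):
        self._samples = {}

    def add(self, name, stream):
        self._samples[name] = stream

    def get(self, name):
        return self._samples[name]

lib = SampleLibrary()
lib.add("jump", "res://sfx/jump.wav")  # hypothetical path
print(lib.get("jump"))  # res://sfx/jump.wav
```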
I also wanna use Godot as an audiovisual art installation / live projection tool. I've been toying with the idea, but I need a few things, like Visual Scripting (which is now implemented), OSC, and the ability to expose the audio stream to script. That way I can automate visuals based on audio.
Essentially I wanna do something like Quartz Composer. Have Godot editor on one screen and have the running live project on the projection side with live sync turned on so I can control the projection right from script.
More than just a music making tool. Literally an audiovisual live programming creation tool.
Fixed the issues discussed here. Some things will not change: sending to buses on the left is by design. You should do fine with this; there is no point in supporting a full graph, as the complexities involved are out of scope for a game engine.
Seeing some feedback for the running game on the audio buses makes sense, I would open a separate issue about this.
I did some searching; there is not much about audio, and I didn't find any of these considerations mentioned.
Godot version:
3.1 dev build from master
I want to expose some of the issues and considerations I found while working with audio over the last few weeks:
Filters:
HighPassFilter - LowPassFilter - BandPassFilter
Issue: The Gain parameter is not needed and doesn't have any effect.
Reverb:
Issue 1: The Damping and Spread descriptions are inverted (Spread widens/narrows the stereo image and Damping controls "reflectiveness").
Issue 2: The Damping value is inverted: 0 is full damping and 1 is no damping (fixing this would break compatibility).
Issue 3: The default values are prone to distortion (and it's worse with auto-normalized waveforms on import):
Default values:
Room Size 0.8
Damping 0.5
Spread 1
Hipass 0
Dry 1
Wet 0.5
I propose these values by default:
Room Size 0.8
Damping 0.5
Spread 1
Hipass 0.15 (This is important; you can break the speakers of all those electronic musicians if you let 50 Hz reverberate... :):):) )
Dry 0.7
Wet 0.5
I think default values are important for an effect like this: they are the first impression, they are easy to change, and changing them doesn't break compatibility.
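To make the distortion concern concrete, here is my own back-of-the-envelope estimate (an assumption, not a measurement of Godot's reverb): if the dry signal and an in-phase early reflection at the wet level simply add, the worst-case peak in dB over full scale is easy to compute.

```python
import math

def peak_overshoot_db(peak_in, dry, wet):
    """Worst-case output peak, in dB relative to full scale, if the
    dry signal and an in-phase wet reflection add. Positive values
    mean potential clipping."""
    return 20.0 * math.log10(peak_in * (dry + wet))

# A normalized sample (peak 1.0) through the current defaults vs the
# proposed ones: both can still overshoot, but the margin shrinks.
print(round(peak_overshoot_db(1.0, dry=1.0, wet=0.5), 2))  # 3.52
print(round(peak_overshoot_db(1.0, dry=0.7, wet=0.5), 2))  # 1.58
```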
StereoEnhance:
Issue: The Time Pullout Ms property seems to bump the right channel out of phase; it would be great to have the ability to choose which channel to delay.
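To illustrate the observation (a toy sketch, not the actual StereoEnhance implementation): delaying only one channel de-correlates the two signals, which produces both the widening and the out-of-phase feel. A channel-select parameter would just decide which channel gets the leading zeros.

```python
def delay_right(stereo, samples):
    """Delay only the right channel of (left, right) sample pairs
    by `samples`, padding with silence and truncating to the
    original length."""
    left = [l for l, _ in stereo]
    right = [0.0] * samples + [r for _, r in stereo]
    right = right[:len(left)]
    return list(zip(left, right))

print(delay_right([(1.0, 1.0), (0.0, 0.0), (0.0, 0.0)], 1))
# [(1.0, 0.0), (0.0, 1.0), (0.0, 0.0)]
```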
AudioBusLayout:
Issue 1: The send control in the mixer only allows you to send audio to buses to the left of the selected bus. This seems like a bug. (Is it a bug?)
Issue 2: The visual representation of the fader is not accurate: -6 dB on the fader is not at the -6 dB mark in the PNG image. There should be some way to enter values manually.
Issue 3, related to 2: a -90 dB range is excessive for game purposes. Most of the time you just need to adjust by -6, -12, or -18 dB, and at that scale it is very difficult (longer faders could make it feasible, but entering a number manually would be better).
The last two issues are related to the problem that the audio bus resource is not editable in inspector.
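A common fix for the scale problem is a nonlinear fader taper. This is a sketch of a standard power-law audio taper, not Godot's current mapping: full travel is 0 dB, and half travel lands near -18 dB, so the musically useful -6/-12/-18 dB range occupies most of the fader instead of being squeezed into the top of a -90 dB scale.

```python
import math

def fader_to_db(position, exponent=3.0):
    """Power-law fader taper: map a 0..1 fader position to dB.
    With exponent 3, half travel sits near -18 dB."""
    if position <= 0.0:
        return float("-inf")  # bottom of the fader is silence
    return 20.0 * math.log10(position ** exponent)

print(round(fader_to_db(1.0), 1))  # 0.0
print(round(fader_to_db(0.5), 1))  # -18.1
```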
Issue 4: The next one is not trivial, but... what good is a mixer in the editor if you can't use it while the game is previewing? In the end you have to build your own visual representation of the mixer inside the game to adjust values while hearing the sound in real time... A mixer without real-time editing feels a bit like a "fake" mixer.
Thanks for reading to the end... Any opinions?