jonlab / NewAtlantis

http://www.newatlantis.world/

Audio synthesis components #24

Open jonlab opened 8 years ago

jonlab commented 8 years ago

What do we need? Noise, additive synthesis, FM, ...? Do we guarantee deterministic synthesis (we only synchronize parameters), or do we synchronize the audio once it is generated?

petesinc commented 8 years ago

I think it is fine to only synchronise parameters. The other thing which seems important to me is a controllable envelope (but maybe this is another entry?).

jonlab commented 8 years ago

Another question is the level of access the user has at runtime. Do we want the user to be able to instantiate objects and change parameters at runtime (during "play mode"), or is this something that must be designed in Unity and exported (and not controllable at runtime)?

rolandcahen commented 8 years ago

At the moment, I would advise not to add real-time audio control unless it has been preset in Unity. I suppose this feature will be expected at some point, but maybe later.

Roland


jonlab commented 8 years ago

Maybe this will be clearer with an example. Let's say that we have an oscillator. This is a component (NAAudioSynthOscillator) with the following parameters (this is what I am talking about):

- duration (s)
- frequency (Hz)
- waveform (sin, cos, square, triangle, sawtooth)
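
For illustration, here is a minimal sketch of what such a component could look like in C#. This is an assumption, not the actual implementation: the field names just mirror the parameter list above, and filling the buffer from OnAudioFilterRead is only one possible approach.

```csharp
using UnityEngine;

// Hypothetical sketch of the oscillator component described above.
// Parameter names mirror the list; OnAudioFilterRead is one possible approach.
[RequireComponent(typeof(AudioSource))]
public class NAAudioSynthOscillator : MonoBehaviour
{
    public enum Waveform { Sin, Cos, Square, Triangle, Sawtooth }

    public float duration = 10f;      // seconds
    public float frequency = 440f;    // Hz
    public Waveform waveform = Waveform.Sin;

    private double phase;             // current phase, normalized to [0, 1)
    private double elapsed;           // seconds generated so far
    private int sampleRate;

    void Awake()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    // Unity calls this on the audio thread; we overwrite the buffer procedurally.
    void OnAudioFilterRead(float[] data, int channels)
    {
        double step = frequency / sampleRate;
        for (int i = 0; i < data.Length; i += channels)
        {
            // Silence once the configured duration has elapsed.
            float s = elapsed < duration ? Sample((float)phase) : 0f;
            for (int c = 0; c < channels; c++)
                data[i + c] = s;
            phase = (phase + step) % 1.0;
            elapsed += 1.0 / sampleRate;
        }
    }

    // t is the normalized phase in [0, 1).
    float Sample(float t)
    {
        switch (waveform)
        {
            case Waveform.Cos:      return Mathf.Cos(2f * Mathf.PI * t);
            case Waveform.Square:   return t < 0.5f ? 1f : -1f;
            case Waveform.Triangle: return 4f * Mathf.Abs(t - 0.5f) - 1f;
            case Waveform.Sawtooth: return 2f * t - 1f;
            default:                return Mathf.Sin(2f * Mathf.PI * t);
        }
    }
}
```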

A user is able to create an object IN UNITY and fill in the parameters (let's say that he wants a 440 Hz, 10 s sine). He can then turn this object into a New Atlantis object and put it in a space, possibly as several instances of the same object. But without access to the parameters at runtime, he will not be able to change it to anything other than a 440 Hz, 10 s sine. We would actually have 2 things to take care of:

- serialization of parameters to the database (the objects are persistent, and for now the only parameters that we store in the database are the position and rotation of the object, and that's it);
- (graphical) interface: how do you change the parameters? (buttons, 3D, 2D, text commands, ...?)
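
To make the serialization point concrete, here is a minimal sketch. JsonUtility, the class name, and the JSON format are all assumptions for illustration, not the current database schema:

```csharp
using UnityEngine;

// Hypothetical: a serializable snapshot of the oscillator parameters, so they
// could be stored in the database next to the object's position and rotation.
[System.Serializable]
public class NAOscillatorState
{
    public float duration;
    public float frequency;
    public int waveform; // enum stored as an index

    public static string Save(NAAudioSynthOscillator osc)
    {
        var state = new NAOscillatorState
        {
            duration = osc.duration,
            frequency = osc.frequency,
            waveform = (int)osc.waveform
        };
        // e.g. {"duration":10.0,"frequency":440.0,"waveform":0}
        return JsonUtility.ToJson(state);
    }

    public static void Load(NAAudioSynthOscillator osc, string json)
    {
        var state = JsonUtility.FromJson<NAOscillatorState>(json);
        osc.duration = state.duration;
        osc.frequency = state.frequency;
        osc.waveform = (NAAudioSynthOscillator.Waveform)state.waveform;
    }
}
```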

jonlab commented 8 years ago

Maybe we can start with something simple, close to what was implemented with the Audio Trunk: a small self-contained component with a simple GUI rendered in the scene. I did a small test with the NAAudioSynthOscillator:

[Screenshot: NAAudioSynthOscillator GUI test, 2015-12-10]

This script can be used in two different ways:

  1. Create an object in Unity, build a New Atlantis Asset, and put it in a space. The user could then change the parameters at runtime.
  2. Declare an Object Spawner tool in the Viewer that would be able to create this kind of object at runtime, so a visitor could instantiate a few oscillators and play with them during a performance (see the sketch below).
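
A minimal sketch of what the spawner in option 2 could look like; the prefab reference and key binding are assumptions:

```csharp
using UnityEngine;

// Hypothetical Object Spawner tool: the visitor presses a key and an
// oscillator object is instantiated in front of them.
public class NAOscillatorSpawner : MonoBehaviour
{
    public GameObject oscillatorPrefab; // prefab with AudioSource + NAAudioSynthOscillator

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.O))
        {
            Vector3 pos = transform.position + transform.forward * 2f;
            Instantiate(oscillatorPrefab, pos, Quaternion.identity);
        }
    }
}
```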

Pushing things further, we could, for example, integrate a Csound-like sound synthesis script parser...

And we could handle GUI toggling with a special tool that would reveal the GUI (we probably don't want all the GUIs to be active at the same time).

rolandcahen commented 8 years ago

Envelope: ASR [% % %]
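
Reading that as attack / sustain / release given as percentages of the total duration (my interpretation of the shorthand), a minimal sketch:

```csharp
// Hypothetical ASR (attack / sustain / release) envelope, with the three
// segments given as fractions of the total duration (assumed to sum to 1).
public static class NAEnvelope
{
    // t is the normalized time in [0, 1]; returns a gain in [0, 1].
    public static float ASR(float t, float attack, float sustain, float release)
    {
        if (t <= 0f) return 0f;
        if (t < attack) return t / attack;            // ramp up
        if (t < attack + sustain) return 1f;          // hold
        if (t < 1f) return (1f - t) / release;        // ramp down
        return 0f;                                    // done
    }
}
```

In the oscillator sketch above, the generated sample could simply be multiplied by `NAEnvelope.ASR((float)(elapsed / duration), a, s, r)`.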

jonlab commented 8 years ago

The Audio synthesis component still needs to be attached to an Audio Source, and any Audio Source triggering component can be used. So you can make a 440 Hz sine composite object by attaching:

- an AudioSource properly configured (rolloff, volume...);
- an NAAudioSynthOscillator properly configured;
- an NAPlayOnCollide to make the AudioSource play on a collision.
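
Assembled from script, that composite could look like this (a sketch only; the factory class is hypothetical and NAPlayOnCollide is assumed to need no extra configuration):

```csharp
using UnityEngine;

// Hypothetical assembly of the composite object described above.
public static class NAObjectFactory
{
    public static GameObject MakeSineOnCollide()
    {
        var go = new GameObject("440Hz Sine");

        // 1. AudioSource, properly configured (rolloff, volume, ...)
        var source = go.AddComponent<AudioSource>();
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.volume = 1f;

        // 2. The oscillator, properly configured
        var osc = go.AddComponent<NAAudioSynthOscillator>();
        osc.frequency = 440f;
        osc.duration = 10f;
        osc.waveform = NAAudioSynthOscillator.Waveform.Sin;

        // 3. Trigger playback on collision
        go.AddComponent<NAPlayOnCollide>();

        return go;
    }
}
```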