Hello Julien, welcome! Is this the same issue as #10? Should we merge them, or do you see this as a separate issue?
Since ADM aims to be agnostic, it describes the result rather than transmitting parameters or a specific controller action ("touch" sounds very specific to a capacitive fader or a touchscreen?). Please, it would be useful to describe your use case in more detail.
What is the purpose of this? You said "who takes control":
Indeed, this requires further clarification: I'm not specifically referring to the iPad or touch screens, but rather to the description: I'm touching this value; we're clearly describing what I'm doing. For me, the 'touch' value mainly functions to enable bi-directionality of OSC information, facilitating the relay of information to communicating devices. It is also crucial for determining who writes the automation in the DAW. Of course, this parameter must appear with an object number for all ADM-OSC parameters.
Currently, Meyer Sound uses: /Console/obj/xy/touch INT(0/1)
FLUX / Nuendo / Ovation use: (/source/obj/xyz) STRING(touch), (/source/obj/xyz) FLOAT FLOAT FLOAT, (/source/obj/xyz) STRING(release)
It is up to us to decide whether this value should be integrated into the messages or kept separate:
Example for XYZ: /adm/obj/()/xyz/touch INT(0/1), or (/adm/obj/()/xyz) STRING(touch) (/adm/obj/()/xyz) FLOAT FLOAT FLOAT (/adm/obj/()/xyz) STRING(release)
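To make the two candidate encodings easier to compare, here is a minimal sketch of what each would look like on the wire, assuming python-osc and a hypothetical receiver address; the object number stands in for the "()" placeholder above:

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical receiver address and port; adjust to your setup.
client = SimpleUDPClient("127.0.0.1", 4001)

obj = 1  # object number, standing in for the "()" placeholder

# Option A: touch as its own address with an INT 0/1 argument.
client.send_message(f"/adm/obj/{obj}/xyz/touch", 1)          # take control
client.send_message(f"/adm/obj/{obj}/xyz", [0.5, 0.2, 0.0])  # position updates
client.send_message(f"/adm/obj/{obj}/xyz/touch", 0)          # release control

# Option B: touch/release as STRING arguments on the same xyz address.
client.send_message(f"/adm/obj/{obj}/xyz", "touch")
client.send_message(f"/adm/obj/{obj}/xyz", [0.5, 0.2, 0.0])
client.send_message(f"/adm/obj/{obj}/xyz", "release")
```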
In ADM there could already be several interactive parts, like changing foreground/background levels, languages or storylines (cf. https://adm.ebu.io). The renderer doesn't need to know "how" the user changes this (touching a button or a fader is of course a use case, but maybe not the only one). It would be very interesting to document what the interactions on objects could be during content production.
A couple of thoughts on this.
Personally, I find it a bit confusing to have something in the /adm/... namespace that doesn't translate to a possible message in the ADM standard. It isn't clear what "touch" actually means, although I agree it could be useful.
I would prefer not to have to inspect the type of the OSC arguments to figure out what the message means. Differentiating between STRING and FLOAT is an extra layer of sorting that could be taken care of in the address. Something like:
/ctrl/obj/<object number>/touch INT 0/1
would be the clearest for me, as an engineer implementing these things.
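For what it's worth, here is a rough sketch of how a receiver could act on that address form, assuming python-osc and a hypothetical listening port; the /ctrl namespace and the address parsing are illustrative only, not part of the spec:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

touched = {}  # object number -> True while an external controller holds it

def handle(address, *args):
    # Expecting addresses like /ctrl/obj/3/touch with a single 0/1 argument.
    parts = address.strip("/").split("/")
    if len(parts) == 4 and parts[0] == "ctrl" and parts[1] == "obj" and parts[3] == "touch":
        obj = int(parts[2])
        touched[obj] = bool(args[0])
        print(f"object {obj} {'touched' if touched[obj] else 'released'}")

dispatcher = Dispatcher()
dispatcher.set_default_handler(handle)

# Hypothetical listening port.
BlockingOSCUDPServer(("0.0.0.0", 4001), dispatcher).serve_forever()
```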
Can somebody clarify whether touch/release is sent only once when a user touches/releases a controller, or whether it has to be transmitted along with all positional data? In the first case, this seems dangerous as UDP packets may be lost. In the second case too, as there is no guarantee that the messages arrive in the same order as they were emitted, or that other messages cannot be received in between.
Why isn't it a sufficient feature to simply send positions while the controller is touched and stop sending them when the controller is released?
As there can be several ADM-OSC transmitters (typically, I can imagine DAW playback plus a controller here), how should the receiver know which one is "touched"? By IP filtering?
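To make the alternative raised in these questions concrete, here is a hedged sketch (plain Python, hypothetical names, not part of any spec) of a receiver that infers "touched" purely from the incoming position stream: whichever sender is currently streaming positions for an object holds control, and control is released after a short silence, so no explicit touch/release message and no IP filtering configuration is needed:

```python
import time

HOLD_TIMEOUT = 0.5  # seconds of silence after which control is released (arbitrary value)

class ObjectArbiter:
    """Tracks which sender currently 'holds' each object, based only on message flow."""

    def __init__(self):
        self._holder = {}     # object number -> sender id (e.g. source IP)
        self._last_seen = {}  # object number -> time of last accepted message

    def accept(self, sender, obj, now=None):
        """Return True if a position message from this sender should be applied."""
        now = now if now is not None else time.monotonic()
        holder = self._holder.get(obj)
        expired = holder is None or now - self._last_seen.get(obj, 0.0) > HOLD_TIMEOUT
        if expired or holder == sender:
            self._holder[obj] = sender
            self._last_seen[obj] = now
            return True
        return False  # another sender currently holds this object

# Usage sketch: whoever streams first keeps control until it goes silent.
arbiter = ObjectArbiter()
print(arbiter.accept("daw", obj=1))         # True  (nobody held object 1)
print(arbiter.accept("controller", obj=1))  # False (DAW still holds it)
time.sleep(0.6)
print(arbiter.accept("controller", obj=1))  # True  (DAW's hold timed out)
```

This also makes the concern above visible: with stream-based inference alone, a sender that streams continuously (e.g. DAW playback) never yields control, which is exactly why an explicit touch message or some form of sender identification keeps coming up in this discussion.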
Thank you so much for helping everybody understand this feature.
Welcome @JPagnier
There's no way to merge issues. I'm closing this with a reference in https://github.com/immersive-audio-live/ADM-OSC/issues/10
Hello everyone, I am Julien and I would like to work with you on organizing this ADM-OSC standard from a multi-brand user perspective.
My first request is to propose a Touch value with an argument of 0 or 1 to define who takes control over the OSC. Some of you have already implemented this natively, but I would really like this message to appear in ADM-OSC to simplify the looping issues.
What do you think about this?