StephanRoemer opened this issue 3 years ago
That has been on my to-do list for a while, yeah. I spent some effort actually prototyping this with a separate JSFX, and it kinda-sorta worked, but there were some edge cases where it was awfully janky. In other words, it was going to be a challenge to get right. It would also require some pretty invasive changes to the Reaticulate JSFX's core design in order to support event buffering (store-and-forward type of behavior).
Since then Reaper has added negative track delay, but when I last played with it, it didn't cope very well with that delay bouncing around (as it would with articulation changes). So I think all the same complexities I was originally facing still apply.
Still, this feature is something I definitely want Reaticulate to have. I'll take another stab at it at some point.
Ahh no, I would definitely not change the track delay on the fly. I guess that would be too unstable. I would really only introduce a "nudge notes in the MIDI editor by xxx ticks to the left" action. All notes between articulation changes would get nudged. Yes, I know, this would move the notes away from the grid, but that's what I'm doing right now to adjust the timing anyway.
Ah, this is definitely easier to implement. It's a bit of a poor man's solution -- dynamic event retiming being the nicest behavior -- but a "nudge notes" action is clearly better than the current state of nothing.
And it's also a nice stepping stone into dynamic event retiming. We can get the attribute into the spec and into bank definitions early, with the easy-to-implement "nudge notes based on articulation delay" action initially, and then later a feature to enable automatic delay compensation on the track. 👍
> Ah, this is definitely easier to implement. It's a bit of a poor man's solution -- dynamic event retiming being the nicest behavior -- but a "nudge notes" action is clearly better than current state of nothing.
Yes, it is a poor man's solution, but given the fact that no DAW has nailed that feature at all, this is definitely a step in the right direction 😄
> And it's also a nice stepping stone into dynamic event retiming. We can get the attribute to the spec and in bank definitions early, with the easy-to-implement "nudge notes based on articulation delay" action initially, and then later a feature to enable automatic delay compensation on the track. 👍
Exactly! It would literally be an additional attribute that gets added to the articulation.
Just linking this forum post, which contains some initial thinking, as a reminder to future me.
Notably, a proper implementation would need to offer contextual delays, based on velocity range and whether the notes are overlapping (legato).
Ah, love that way of organizing knowledge! 😃 👍
Seems I was thinking on the same wavelength as both of you, glad to see this is under consideration :) https://vi-control.net/community/threads/reaticulate-articulation-management-for-reaper-0-4-6-now-available.66851/post-4774886
I've been giving this a bit of thought and some prototyping last night. I got the basic mechanics working for a track-global delay, but of course ultimately it needs to be much more granular than that.
I'm familiar with some libraries, like CSS, that have different latency characteristics based on note velocity. This is relatively straightforward to handle as there are only 128 velocities I need to map, and double that to account for legato vs non-legato variants, across all 16 source channels. So it's not adding a lot of extra memory usage to the JSFX (an extra 16KB per instance).
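To make the sizing concrete, here is a minimal Python sketch (not Reaticulate's actual JSFX code; all names are illustrative) of the lookup table described above: one delay per velocity, doubled for legato vs. non-legato, per source channel.

```python
# Hypothetical per-velocity delay lookup, as a library like Cinematic
# Studio Strings would need: 128 velocities x 2 (legato/non-legato)
# x 16 source channels.
NUM_VELOCITIES = 128
NUM_CHANNELS = 16

# delay_map[channel][legato][velocity] -> delay in ms (negative = send earlier)
delay_map = [[[0] * NUM_VELOCITIES for _ in range(2)]
             for _ in range(NUM_CHANNELS)]

def set_velocity_delay(channel, legato, velocity, delay_ms):
    delay_map[channel][int(legato)][velocity] = delay_ms

def lookup_delay(channel, legato, velocity):
    return delay_map[channel][int(legato)][velocity]

# Footprint estimate matching the ~16KB figure, at 4 bytes per slot:
entries = NUM_CHANNELS * 2 * NUM_VELOCITIES   # 4096 slots
print(entries * 4)                            # 16384 bytes, i.e. 16KB
```

The per-CC variant discussed next multiplies this by another factor of 128 (one table per CC number), which is where the ~2MB-per-instance figure comes from.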
Are you guys aware of any libraries that manifest different latencies by current CC value? e.g. if CC1 is above 64, latency is 200ms, below it's 100ms, say.
I can't think of any myself, and I'm rather hoping to avoid implementing it, because that means mapping 128 values for 128 different CCs times two (legato vs non-legato), across 16 channels, which will add another 2MB of RAM per JSFX instance. Things start to add up very quickly.
Hmm, nope, can't think of any library, really! It's usually based on velocity, and I think only the Cinematic Studio Series does that. Happy if others can chime in here.
@jtackaberry No, the libraries I know don't do that. In my case the delays differ only between the articulations 🙂 I'm looking forward to the delay compensation being available in Reaticulate at some point 👍 Keep up the good work 🙂
@jtackaberry hello. I think you are overthinking it a lot. Just being able to set a negative delay per articulation would be plenty enough for most cases. Remember also that users have to input all these delays.
Also here's a suggestion for how you can implement this feature:
For example, if articulation A is -120ms, and articulation B is -100ms, then you set a track delay to -120ms, and on playback notes with articulation A they are delayed 0ms, and notes with B are delayed 20ms.
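The arithmetic in this suggestion can be sketched in a few lines (a hypothetical helper, not anything from Reaticulate itself): pick the most negative articulation offset as the track delay, then shift each note by the difference.

```python
# Sketch of the suggested scheme: the track delay absorbs the most
# negative articulation offset, and each note is then delayed on
# playback by (its articulation's offset - track delay).
def playback_shift_ms(articulation_delay_ms, all_delays_ms):
    track_delay_ms = min(all_delays_ms)  # most negative offset, e.g. -120
    return articulation_delay_ms - track_delay_ms

delays = {"A": -120, "B": -100}
print(playback_shift_ms(delays["A"], delays.values()))  # 0
print(playback_shift_ms(delays["B"], delays.values()))  # 20
```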
@HanaMcHanaface
> I think you are overthinking it a lot.
I graciously accept that criticism, because it's usually true for me. I have a way of getting caught up in the weeds. But I also think you are underthinking it. :)
Even in your simpler use case where an articulation has a single fixed delay offset, implementation is surprisingly complex and is fraught with tricky edge cases.
Minimally, you need to treat different kinds of MIDI events with different delays. Note-ons of course need to be delay-compensated, but what about note-offs? In practice these should be scheduled in real time (i.e. aligned to the grid per the source MIDI) so the note sounds for as long as is written.
Except, of course, when the next note-on is from a different articulation with a greater (by which I mean "more negative") delay offset than the current note. That would require the next note-on to be delivered before the previous note-off's grid position, so in that case the previous note-off does in fact need to be adjusted too in order to preserve the original event ordering, even though in most cases note-offs should be scheduled in realtime.
Then there are performance CCs, like CC1 or CC11, which need to stay realtime. So far that's two independent event queues: a delay-compensated queue for note-ons, and a realtime queue for CCs and sometimes-but-not-always note-offs.
What about the MIDI outputs from articulation changes themselves? When you use Reaticulate to insert an articulation, you get a Program Change event which Reaticulate translates to the appropriate MIDI to switch articulations in the VI. Should these be realtime, or delay-compensated?
It turns out that it actually depends, because some libraries use keyswitches to perform transient embellishments/flourishes on the current note. If they're triggered as Program Changes, Reaticulate would consider these articulation changes, but you as a composer would certainly expect these to occur in realtime, not shifted earlier according to the last note-on's delay offset.
But then if you're actually switching articulations in a latching sense, say from spiccato with a -60ms offset to legato with a -150ms offset, it might well not work to send the articulation change MIDI in realtime, if the Program Change in your MIDI item was closer than 150ms to the next note-on. In that case, we need to delay-compensate the output events from the articulation change too. In practice, this ends up being a kind of third queue.
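The routing rules above can be sketched in miniature. This is a much-simplified toy (not Reaticulate's implementation, and it ignores the articulation-change queue entirely): note-ons are delay-compensated, CCs stay realtime, and note-offs stay realtime unless a compensated note-on would otherwise jump ahead of them.

```python
# Toy scheduler for the event-routing rules described above.
# Each event is (grid_time_ms, kind, delay_ms), delay_ms <= 0.
NOTE_ON, NOTE_OFF, CC = "note_on", "note_off", "cc"

def schedule(events):
    """Return (send_time_ms, kind) pairs, sorted by send time."""
    out = []
    for i, (t, kind, delay) in enumerate(events):
        if kind == NOTE_ON:
            out.append((t + delay, kind))        # negative delay: send early
        elif kind == NOTE_OFF:
            send_at = t                          # realtime by default
            # If the next note-on is compensated to land before this
            # note-off, pull the note-off back to preserve ordering.
            for t2, k2, d2 in events[i + 1:]:
                if k2 == NOTE_ON:
                    send_at = min(send_at, t2 + d2)
                    break
            out.append((send_at, kind))
        else:
            out.append((t, kind))                # CCs stay realtime
    return sorted(out)

# Spiccato note at 0ms (-60ms offset), its note-off at 500ms, then a
# legato note at 520ms (-150ms offset): the note-off gets pulled back.
print(schedule([(0, NOTE_ON, -60), (500, NOTE_OFF, 0), (520, NOTE_ON, -150)]))
# → [(-60, 'note_on'), (370, 'note_off'), (370, 'note_on')]
```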
Same-note legato (e.g. rebows) is tricky when you delay-compensate note-ons but not note-offs. Similarly, some libraries auto-engage legato even for disconnected notes as long as the next note starts within a certain time window of the previous note. If we're delay-compensating note-ons, we end up meddling with the interval between the last note-off and the next note-on that you see in your MIDI item, such that things can suddenly play legato even when that wasn't intended and wouldn't play that way without Reaticulate.
I could go on. So many edge cases even with the single delay per articulation you suggested. It turns out that once you address all these scenarios, which you need to do in any case, supporting things like a user-configurable delay based on note velocity is pretty easy.
I've recently been corresponding with the author of the Variable Delay Compensator (VDC), which solves this problem as a KSP multiscript, and they've opened my eyes to even more edge cases and library quirks I hadn't previously considered. They've been helping me tease apart the algorithm used by the VDC, where they've already done the heavy lifting of wrestling with these slippery real-world problems, and I'm working to wrap my head around that and adapt it to Reaticulate.
My hope is the capability will land in the next major release of Reaticulate (0.6), but it really depends on how janky the implementation ends up being. It remains a WIP.
@jtackaberry okay, I'm convinced :D Good luck, I hope to see it become a reality
> Variable Delay Compensator (VDC)
Didn't know about this project, very interesting.
@jtackaberry would you consider releasing an alpha version of Reaticulate that has a rudimentary implementation of note offsets? As I understand it, most of the complications arise when notes are either very close together or overlapping during an articulation switch. However, I would be happy to leave a large gap in the notes for an articulation change (say 1-2 seconds, with the articulation change placed in the middle of it) and fill in that gap using the same instrument on a second track.

I know that's not an ideal solution, but I feel it's significantly better than the alternative of having a dozen tracks per instrument just to cover each articulation (if I don't want to sacrifice note offsets). I really want to use Reaticulate, but without at least some rudimentary/experimental support for timing offsets per articulation, it's kind of hard to justify for me, especially as a beginner composer just trying to put notes down so they sound where they look like they should. Does such a workflow with a gap for the articulation change (plus a second track to fill the audible gap) seem plausible to you?
@Dillpickleschmidt my composing partner and I have integrated Reaticulate into our main orchestral template to great effect. To be fair, it takes some work to set up properly, but once it's in, it's a godsend. Even without this dynamic offset feature. The obvious solution we generally have is to simply duplicate a track where this presents a problem and split the articulations. Not ideal, but still very workable.
Having said that... @jtackaberry if you ever find a way to move mountains and make this a reality, you would be a GOD! It's one of the features I've been most desperate for since running a template of this kind. And no doubt every computer-based composer on earth would find this one of the biggest time-(and annoyance)-saving things imaginable.
There's a hacky way to change the delay per articulation using separate channels and the JS: Midi Delay effect.

I'll use Pacific Strings as a simple example. There's an 80ms delay on all articulations, except on legato sustains, where it's 180ms. Set the track delay to match the legato sustains, then route the other articulations through an instance of JS: Midi Delay with Delay (ms) set to 100 and Channel set to 2.
I'm fairly sure you could leverage this in keyswitchable instruments as well:
Ex: If your marcatos are 60ms, tenutos 40ms, and shorts 20ms, use 60ms for the whole track, put the tenutos on ch2 and delay them by 20ms, put the shorts on ch3 and delay them by 40ms, and so on.
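The pattern behind this workaround is simple enough to sketch (a hypothetical helper; the articulation names and channel assignments are illustrative): set the whole track to the largest latency, then give each faster articulation a per-channel Midi Delay equal to the difference.

```python
# Sketch of the channel-based workaround: the track delay covers the
# slowest articulation, and each faster one gets an extra positive
# delay of (max latency - its latency) via JS: Midi Delay.
def midi_delay_settings(latencies_ms):
    """latencies_ms: articulation -> latency. Returns (track latency,
    per-articulation extra delay)."""
    track = max(latencies_ms.values())
    return track, {name: track - lat for name, lat in latencies_ms.items()}

track, extra = midi_delay_settings({"marcato": 60, "tenuto": 40, "short": 20})
print(track)   # 60 -> set the track delay to -60ms
print(extra)   # {'marcato': 0, 'tenuto': 20, 'short': 40}
```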
@Jason, I think we had a talk about this in the past: negative note delay depending on the articulation.
A group of composers started to maintain an Excel Sheet with the delay times of different libraries and their articulations: https://docs.google.com/spreadsheets/d/1WP9sobba7OkldNkTiSzXP7r3Pb64IzWQWrLkqdiyRcA/htmlview#
The idea would be to provide that articulation delay in Reaticulate and then have the ability to reposition those notes with an action like "adjust articulation delay". If a track delay is set, that value should be taken into account too. E.g.: a negative track delay of -100 is set and a staccato articulation needs -130, so the notes are only shifted by -30.
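The nudge amount described here reduces to one subtraction; a minimal sketch (the function name is hypothetical, not part of Reaticulate):

```python
# Sketch of the proposed "adjust articulation delay" action's arithmetic:
# nudge each note by the articulation's delay minus whatever the track
# delay already compensates.
def nudge_ms(articulation_delay_ms, track_delay_ms=0):
    return articulation_delay_ms - track_delay_ms

print(nudge_ms(-130, -100))  # -30: staccato needs -130, track covers -100
print(nudge_ms(-130))        # -130: no track delay set
```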
But maybe you are already implementing something similar :P