tim-janik / anklang

MIDI and Audio Synthesizer and Composer
https://anklang.testbit.eu/
Mozilla Public License 2.0

Negative frame offsets in MidiEvent #26

Closed by swesterfeld 4 months ago

swesterfeld commented 7 months ago

I don't think we actually want negative frame offsets in the MidiEvent struct. It complicates device implementation (like BlepSynth), and I don't see that there are valid use cases.

tim-janik commented 7 months ago

The one case where it would be "correct" to deliver events with negative frame offsets is live recording of MIDI events if we implemented a near contiguous virtual sample clock. For example:

stamp=256: render() completes
stamp=300: NOTE_ON from MIDI keyboard
stamp=400: PARAM_CHANGE (from keyboard or GUI)
stamp=512: next render() call (**)

In the second render() call, the NOTE_ON offset would be -212 and the PARAM_CHANGE offset -112. However, since we already assign/process all PARAM_CHANGE events with frame=0, and because the NOTE_ON will be delivered during future replays at (stamp=256, frame=44) instead of (stamp=512, frame=-212), there is no practical use for negative frames at the moment.

tim-janik commented 5 months ago

As I wrote previously, the one case where negative offsets could be useful is when realtime events are delivered after they have arrived. The synthesis processors cannot really react to that in any meaningful way, but we might need the exact time stamps for recording. Which means, if recording is implemented at the right level, i.e before the actual timing information is lost, it'd be sufficient to move the timestamp to offset 0 of the next block. We just don't have recording implemented yet, other than that I'd be happy to see a PR that makes the timestamp unsigned.

swesterfeld commented 4 months ago

I've thought about this and I have an idea, but first, here is what we want to achieve:

So losing timing information at some level may not be such a good idea here. Maybe, if we have a block size of 256, what we really want to do is add 256 to all "negative" timestamps. This introduces a constant delay for MIDI input, equal to the worst case (here ~5 ms at 48 kHz). It also means that if a musician plays two notes exactly 9 ms apart, they will be exactly 9 ms apart in the recorded MIDI as well. And during live recording, the audio output will be identical to what we get after playing back the recorded project.

I'm suggesting this because I saw that if I make MidiEvent::frame a 12-bit unsigned bitfield (`uint : 12`), the micro-timing information will be discarded at some point, and this causes us to produce "wrong" or "different" output.