Integration of synthesis modules that process MIDI is very complicated (#30 #102), and we have a number of issues (#7 #26) because we use multiple threads for:
MIDI sequencing,
MIDI note processing and voice allocation,
audio synthesis.
Integration of these components should be rewritten so that:
a) MIDI processing occurs in the synthesis engine at the end of each synthesis cycle (when the engine is most idle), possibly in parallel, by transforming the MIDI notes stored in parts into actual note-on/off/etc events for the next engine block.
b) Synthesis modules can process MIDI input/output channels in addition to the current audio channels, so event processing and voice allocation can be handled efficiently inside a module instead of through complex side-loaded logic.
See also: https://github.com/tim-janik/beast/projects/3#column-5205616