monocasual / giada

Your Hardcore Loop Machine.
https://www.giadamusic.com
GNU General Public License v3.0

MIDI overhaul #375

Open tomek-szczesny opened 4 years ago

tomek-szczesny commented 4 years ago

There are some issues with current MIDI low-level stuff that will backfire if not taken care of.

Namely, it is built around the assumption that incoming MIDI messages are 3 bytes long. As far as I can tell, longer messages get truncated, and shorter ones are simply ignored.

While working on a midiClock module I noticed that longer messages (like the MTC full frame) are sent out in a few batches of 1-3 bytes each. I'm not sure how RtMidi handles that, but we'd probably be better off generalizing kernelMidi::send to work with messages of arbitrary length, increasing the chances that RtMidi will pass them on as a single message, just as intended.

The same problems apply to received messages. MIDI devices tend to work with internal state machines that don't care about small delays between consecutive bytes, and again, I'm not sure what RtMidi treats as a single multi-byte message and what might arrive in Giada broken into a few pieces. I'm especially worried about MTC full frames, which are 10 bytes long.

That being said, midiDispatcher will play a major role in a ton of changes I'm working on. Message lengths will vary from the single-byte Midi Clock, through variable-length CCs, to the 10-byte MTC frame. The question is whether we can blindly trust RtMidi to always parse messages passed to Giada into logically consistent chunks, or whether we should aim towards building an internal buffer and enforcing correct parsing and dispatching. The answer is probably somewhere in RtMidi's docs. :)

Anyway, I just wanted to let everyone know what I plan to do, hopefully get a green light from @monocasual, have the issue tagged properly, get an issue number to name the branch after, hear some opinions, et cetera.

tomek-szczesny commented 4 years ago

Worth noting:

"The RtMidiIn class provides the RtMidiIn::ignoreTypes() function to specify that certain MIDI message types be ignored. By default, system exclusive, timing, and active sensing messages are ignored."

tomek-szczesny commented 4 years ago

Another mandatory feature of a MIDI receiver: support for Running Status. http://midi.teragonaudio.com/tech/midispec/run.htm
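
A sketch of what handling it could look like (untested and simplified; real code would also need to cope with interleaved Real Time bytes):

```cpp
#include <cstdint>
#include <vector>

// Running Status: if a message starts with a data byte (< 0x80), the last
// seen channel status byte is implied and must be re-inserted.
class RunningStatusParser
{
public:
    std::vector<uint8_t> parse(const std::vector<uint8_t>& in)
    {
        std::vector<uint8_t> out = in;
        if (in.empty())
            return out;
        if (in[0] < 0x80) {
            if (m_lastStatus != 0)
                out.insert(out.begin(), m_lastStatus); // reuse stored status
        }
        else if (in[0] < 0xF0)
            m_lastStatus = in[0]; // channel message: becomes the new status
        else if (in[0] < 0xF8)
            m_lastStatus = 0;     // System Common cancels Running Status
        // bytes >= 0xF8 are Real Time and do not affect Running Status
        return out;
    }

private:
    uint8_t m_lastStatus = 0;
};
```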

tomek-szczesny commented 4 years ago

Okay, I went totally berserk about my ideas and took the liberty of crafting a dedicated wiki page to stop spamming github issues and the forum ;) https://github.com/monocasual/giada/wiki/MIDI-overhaul

monocasual commented 4 years ago

Thanks @tomek-szczesny for the Wiki entry, I'll take some time to process it and reply back on here. Regarding your low-level MIDI questions: yes, kernelMidi currently emits and receives only 3-byte MIDI messages.

I'm not 100% familiar (yet) with the MTC full frame; we should probably ask the RtMidi guys about it over on GitHub (their repo).

I'll keep you posted.

tomek-szczesny commented 4 years ago

No worries, @monocasual, take your time. We're all busy people. :) Thanks for the link, I'll ask them everything I need once I'm sure what I need.

The wiki page is subject to continuous change. I did some research on the MIDI protocol and read a bunch of C++ resources to actually be able to pull this off. Every time I learn something new I update the thing.

tomek-szczesny commented 4 years ago

Mmh. The wiki page got too long to handle, so I created another one for midiController. https://github.com/monocasual/giada/wiki/midiController-in-greater-detail

monocasual commented 4 years ago

@tomek-szczesny here we go. First of all thank you for the effort and the thorough proposal. I like your sender-receiver idea. Before diving head-first into the MIDI overhaul I need to understand the current situation, what's missing from the current engine and what should be done to bridge the existing gap(s). In other words: what problem(s) are we trying to solve?

Concretely, the core issue lies in the 3-byte message limitation. If I understand correctly, that's not enough to receive or send well-formed MTC messages, also known as MTC full frames. If so, this would require some sort of buffering and the ability to differentiate them from regular MIDI messages. Is this correct?

Are there other primary issues or weak points that require us to rethink the current architecture? What benefits would the rethink bring us? I believe these are important questions to address first.

P.S. Don't worry about the C++ struggles :) As soon as we have a sound theoretical model, we'll find the best way to implement it. Where "best" means: easy to understand, easy to maintain, easy to expand.

tomek-szczesny commented 4 years ago

I'm grateful for your time and your feedback. :)

There are many problems and limitations that I attempt to solve.

My proposal covers issues that are already here: Multiple MIDI inputs and outputs #215, Master/Slave MIDI sync with multiple outputs #137 #177, bidirectional CC support for channels #357, mapping enhancements #384, to name a few that I didn't propose myself. From my point of view, there is much to be explored in terms of controller support and code flexibility.

Multiple MIDI channel support calls for a complete overhaul on its own. Existing code (midiDispatcher!) is centered around a single input/output pair. This is acceptable to maintain, but really hard to expand.

Moreover, I aim to keep all MIDI affairs within midiSomething.cpp files. Right now I believe that this is not always the case (clock). It is a subjective matter, but I find some solutions messy - lightning is handled in the kernel, learning logic is in the dispatcher, along with the decision of which messages should or should not be forwarded to which channel.

In order to achieve maximum flexibility and performance, I set off to define a layer stack, where every mechanism has its own place in the hierarchy. The lowest layer should provide communication via MIDI ports. midiDispatcher should dispatch messages and do nothing else, etc.

I also find it disturbing that there is no layer between incoming raw data and actions applied to Giada - there is a Dispatcher that does most of that, but it is designed to cover only a few use cases. It might be just me, but I think midiDispatcher::processChannels() is not very expandable. That's why I gave hours of thought to defining a midiController module that adds a logic layer for doing advanced stuff with controllers.

3-byte messages are a really big problem. Perhaps not the core issue, but simply one of the necessities that must be fulfilled to move forward. The MTC full frame is 10 bytes long, that's true, and sending it in 3-byte packets is risky. SysEx messages can be of arbitrary length, for example when asking a MIDI controller about its make and model. NoteOnOff messages may be two bytes long, not three, if the first byte is skipped (so-called "running status", a mandatory feature according to the standard). Any received MIDI message can hold an extra Real Time message byte (for example Midi Clock), breaking the consistency of the anticipated message. This is a rare case with virtual instruments, but it might happen quite often with physical devices that send data at a limited rate.

To solve all these problems and cover all possible future uses, I suggest keeping the whole vector received from RtMidi and fixing it if it's broken into pieces or mixed with real time messages. It's not hard to keep it backwards compatible, as envisioned in Stage 1.
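
As a sketch of the "fixing" part (untested): Real Time messages are single status bytes in the 0xF8-0xFF range, so they can be pulled out of a raw buffer without disturbing the message they interrupted:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Split a raw buffer into the interleaved single-byte Real Time messages
// (0xF8-0xFF, e.g. MIDI Clock, Start, Stop) and the remaining regular bytes.
std::pair<std::vector<uint8_t>, std::vector<uint8_t>>
splitRealTime(const std::vector<uint8_t>& raw)
{
    std::vector<uint8_t> realTime;
    std::vector<uint8_t> regular;
    for (uint8_t b : raw) {
        if (b >= 0xF8)
            realTime.push_back(b); // dispatch each of these on its own
        else
            regular.push_back(b);  // reassembled into the original message
    }
    return {realTime, regular};
}
```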

Perhaps the best way to describe the advantages is to follow my implementation stages draft.

1) A new midiPorts module would be ready to create multiple connections to the outside world. Messages would be parsed according to standards we don't support yet, packed into neat midiMsg class instances, and would hold any possible MIDI message for future use.
2) midiSockets would allow interfacing with any MIDI-aware module, be it a physical controller or a Giada channel. midiDispatcher would operate with midiSockets, unaware of what the source or destination of a message is. midiDevice would be a placeholder for handling global MIDI messages, like transport. At this stage, inter-channel routing or external message bypass would be very easy to implement.
3) Multiple MIDI ports.
4) Master/Slave MIDI clock / MTC support, which can be routed outside Giada as well.
5) The midiController file would dynamically create and remove bindings in midiDispatcher and take lightning support over to a higher level, closer to the channels in the hierarchy. Incoming messages would go to their destination by the shortest possible route, and no receiver would receive unwanted messages.
6, 7, 8) A new era of MIDI controller support. ;)

The most exciting part I'd like to point out is that all the modules I'm designing can handle all possible MIDI messages in all possible directions - there are no limits to future feature expansion that I can think of. I also try to follow the good programming practice of avoiding any special cases in handling anything.

All that being said, I admit that I study C++ in my free time to cope with this task, and I should be able to do most, if not all, of this work. Any help would be appreciated though, and if there is a way to create a cooperative workflow, I'd be grateful for any tips on how to organize this concept into workable documentation.

monocasual commented 4 years ago

My proposal covers issues that are already here: [...]

This is good to know.

All in all, I agree with you when you say that the current architecture would benefit from more abstraction. In my vision I would love to obtain something like this:

[MIDI I/O diagram]

This is a very high level view of MIDI management. The first row depicts MIDI input: the raw message gets processed by a separate MIDI module (the MIDI layer) which outputs a MIDI message class. Such a class contains all the information needed for the audio engine to act accordingly, along with the original MIDI data. The same thing happens in reverse in the second row, depicting MIDI output. The good thing about this design is that everything MIDI-related is contained and managed by the MIDI layer. No more components scattered around as it is now (clock, dispatcher, ...).

If I understand your proposal correctly, the design above should go along with it pretty nicely. Assuming this, the bright side is that you don't have to touch too much of the existing stuff in the engine (besides reorganizing clock and other "embedded" parts). It would be great if you could provide two examples - one for MIDI input and another one for MIDI output - with the multiple layers in action and how the MIDI messages flow across them.

Final note on a possible downside of the whole operation: being an overhaul, the changes to the code base will be significant. We might not be able to include it in the 1.0 release, let alone blindly merge it as-is, without adjustments. So once again my suggestion is not to focus on writing perfect C++: that's something we can fix later on. A working prototype would be cool enough.

Thanks for the hard work!

tomek-szczesny commented 4 years ago

Alright, I made a quick graph of my proposal:

[midioverhaul diagram]

I made my own because it is much more detailed and less generic.

The midiMsg class that I use is very different from what is on your graph - it contains a message vector and a collection of methods for a sane receiver to parse it. It's up to a receiver to deal with the message and call the appropriate methods in the audio engine. That's why there are numerous receivers that deal with different kinds of messages. The midiMsg class also contains sender and receiver information, and a timestamp, so one might say it's a dispatch-oriented object.
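
A rough shape of what I have in mind (names and methods illustrative, not final code):

```cpp
#include <cstdint>
#include <string>
#include <vector>

class midiMsg
{
public:
    midiMsg(std::string sender, std::vector<uint8_t> body, double timestamp)
    : m_sender(std::move(sender)), m_body(std::move(body)), m_timestamp(timestamp) {}

    // Parsing helpers for a sane receiver:
    bool    isNoteOnOff() const { return !m_body.empty() && (m_body[0] & 0xE0) == 0x80; }
    uint8_t channel()     const { return m_body.empty() ? 0 : m_body[0] & 0x0F; }

    const std::string& sender() const { return m_sender; }

private:
    std::string          m_sender;    // who produced the message
    std::string          m_receiver;  // filled in while dispatching
    std::vector<uint8_t> m_body;      // the complete raw MIDI message
    double               m_timestamp; // arrival time
};
```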

A quick explanation of one part that might not be clear at first glance: midiSockets are specialized objects that interface their parent with midiDispatcher. Each parent object type will typically have a different midiSocket that handles received messages in its own way. Most notably, they will call different functions in their parent objects. A midiSocket may convert midiMsg instances into something else, but I have never used that approach so far.

Note that it's easy to create a new object, along with its own midiSocket, and bind it to midiDispatcher to implement any new features - without touching the audio engine or any existing part.

Here are some cases:

  1. Giada is initializing.

According to Project settings, midiController creates a binding from Launchpad to itself, so all messages from Launchpad will be forwarded to midiController by default, since it's providing a logic layer for this dumb controller.

  2. A MIDI Channel is created. A channel state message goes to midiController, which grows aware of its existence. midiController automatically creates a binding to a button on the channel grid controller (Launchpad), if applicable, and holds this information in itself. midiController refreshes the lightning on that controller's button by sending a MIDI packet addressed directly to the controller's socket (in this case, midiDispatcher forwards the message to the desired destination without consulting its routing tables). Since midiController sends the packet directly to Launchpad's address, no extra binding in midiDispatcher is necessary.

  3. A MIDI channel is configured to accept messages from Keyboard X, channel 4. This information is passed to midiController via a channel state message. midiController creates three bindings in midiDispatcher: all NoteOnOff messages on channel 4, from Keyboard X's socket, must go to the appropriate MIDI Channel socket. Similar bindings are created for CC and Pitch Bend. A message from Keyboard X passes through the RtMidi lib and goes to midiPorts, where it is examined for consistency and fixed against numerous standards. As a midiMsg class instance (which should hold only legal messages), it is passed to Keyboard X's midiSocket. midiSockets always pass messages to midiDispatcher if they come from their parent objects. midiDispatcher checks the received message against its tables and finds a match with some MIDI Channel (could be more than one). It forwards the message to all fitting recipients' sockets. The MIDI Channel's socket receives a midiMsg frame and does whatever it does as of today.

  4. MIDI Clock setup. A user has selected synchronization to an MTC signal incoming from port Y. At the same time, Keyboard X accepts Midi Clock (MC) messages to control its internal arpeggio capabilities; midiController knows that from Keyboard X's config file. Moreover, the user wishes to send MC to some other software S.

By default, midiClock is bound to all MC and MTC signals (except those from itself - midiDispatcher shall never send a message back to its sender). Also by default, midiClock generates both MTC and MC and sends them to the Dispatcher. Again by default, these messages are not routed anywhere, so they get discarded at this point. The point is to ALWAYS have MTC and MC frames in the Dispatcher - either self-generated or externally sourced.

In the discussed example, midiClock calls midiDispatcher to unregister itself from any bindings that treat it as a receiver. Then it calls midiDispatcher again to route MTC signals from port Y to itself. It calls midiDispatcher once more to discard any MTC messages coming from sources other than port Y (an exclusive binding that sends them to some sort of "/dev/null" receiver :) ). And again, midiClock calls midiDispatcher to start routing MC signals from anyone to software S. In the meantime, midiController independently calls midiDispatcher to bind any MC signals to Keyboard X. Since MTC signals are being received, midiClock does not create or send signals of this kind. However, it still generates MC signals and broadcasts them to the Dispatcher, for anything that might utilize them.

In the future, midiClock might try autodetecting incoming clock signals to simplify configuration.

  5. (Future) A Sample channel in Instrument mode would love to play some MIDI notes. Since it is incapable of holding MIDI notes itself, it could play notes stored in a MIDI Channel. Moreover, it could respond to live presses on the physical Keyboard X.

So, in the channel's settings, a user picks a MIDI Channel AND Keyboard X as its inputs.

That information, again, goes through channel state to midiController, which creates the appropriate bindings.

Note that it is possible to create MIDI message loops this way, by binding two MIDI channels back to back. Either a midiDispatcher sanity check should refuse to create such bindings, or we might as well let them be - I did that once in Seq64 and learned my lesson. ;)

I guess I dove into great detail, AGAIN. Sorry about that.

PS. About the implementation effort: I'm not completely sure how hard it will be to code it all. I feel that most of the legacy code will be reused anyway.

The original work would involve a brand new midiDispatcher and a vastly expanded midiController. The former won't be too complicated, as I really want it to offer no sanity checking at all. Maybe even a single function will do? The latter - well, I'm up for the challenge of coding logic for the controllers.

tomek-szczesny commented 4 years ago

In mid-September I'm going to spend a week in the Czech mountains, and I plan to bring a netbook along to entertain myself during lonely evenings. It would be awesome to get some feedback in the next two weeks, so I could do some proper work on Giada during that time.

monocasual commented 4 years ago

Hey @tomek-szczesny, sorry for the delay. Vacation time + hardware upgrade is a deadly combination for development :) I took some time to study your design and I'm trying to sum it up in my own words, to better understand it. Please be patient and let me know if I got it right.

MidiMsg

An enriched MIDI message container. This is how MIDI information is passed around various modules in the new architecture.

MidiDispatcher

The central point of the architecture: a dispatcher that relays all MIDI messages in Giada, both those exchanged with the outside world and internal ones. Other modules register themselves as message receivers; MidiDispatcher takes a MIDI message and forwards it to all the registered receiver modules.

MidiSocket sub-module

Each module owns a specific MidiSocket. Its primary objective is to parse incoming messages (the MidiMsg objects) and call the appropriate procedures in the module it belongs to.

MidiPorts module

A simple backend for MIDI communication with the external world. This module talks to RtMidi directly.

MidiClock module

Manages MIDI sync, both master and slave.

MidiDevice module

Manages global MIDI messages.

MidiController module

A generalized definition of physical MIDI controllers. It deals with MIDI learning, channel control, lightning messages and other hardware-related features.

Questions/issues:

  1. How does a MidiSocket interact with a module? Please provide an example.
  2. What is the benefit of having one MidiSocket for each module? Why not parse the message right in the module itself?
  3. What is the big box called "Sample Channels, MIDI channels, Instrument channel (?)" in the graph?
  4. What are "channel events" and "channel state" in the graph?
  5. How does a module interact with the audio engine? E.g. I press a button on my MIDI controller in order to play a Sample Channel: what is the actual workflow? (I guess this will somehow answer the two previous points above)
  6. The whole MidiController thing looks massive. My concern is that the design is trying to solve too many problems at once. For a first implementation, I would suggest avoiding the "MIDI device educated guess" feature. MidiController can still be a module that deals with hardware-related stuff (learning, lightning, ...) without fancy device recognition.

tomek-szczesny commented 4 years ago

Hi!

Thanks for your time. I hope you had a great time on holidays and setting up a new box!

Yes, you got all basic features right! And you raised very good questions indeed.

1 & 2: midiSocket is not a well defined idea yet because I'm not confident about actual implementation of it. The reason I wanted them to be separate entities was to make them identical in terms of communication with midiDispatcher. Dispatcher would send a message to a midiSocket by using a pointer to it, but maybe a similar mechanism could be obtained with just a unified midi parsing method within each module. However, I don't know whether this would work in case of Giada channels, multiple instances of the same class. I feel that each instance should have its own copy of midiDispatcher interfacing code, for the sake of easy "addressability". It all boils down to a question: Upon constructing a new instance of a Sample / MIDI Channel, will a member function be created as a separate copy in memory?

  3. The big box with Sample and MIDI channels is the existing implementation of Sample channels and MIDI channels, especially the src/core/channels files. I mentioned somewhere that when it comes to MIDI stuff, I imagine them receiving only musical notes routed directly from external devices (or exchanged between each other?). Learning, control, lightning etc. should be handled by the midiController module.

  4. Channel events are commands that midiController uses to control channels, pretty much like midiDispatcher does right now, by calling glue/events.cpp methods. Channel state is basically m_channelState->playStatus.load(), sent to midiController on each change to save it from polling that information. It is necessary for lightning.

  5. If you press a button and expect a channel to play, that means the midiController module is aware of this binding and has already registered itself in the Dispatcher as a receiver of such a message. When the message is received by midiController, it will call c::events::pressChannel(blah), pretty much just like midiDispatcher does right now. In general, I plan to interface with the audio engine using exactly the same methods that are used as of today - most notably in midiDispatcher and clock.

  6. midiController is indeed a big project on its own; this is why I split its development into separate stages (5 to 9). The first stage (Stage 5) covers only the currently existing functionalities of Giada and is fairly simple. After that I'll attempt to implement the features I have in mind. Keep in mind that the whole implementation plan is divided into stages: each stage results in a fully functioning Giada, even if the new MIDI engine is only partially implemented. I plan to tag my commits according to this plan in my midi-overhaul branch (on my github account).

tomek-szczesny commented 4 years ago

I just realized that using pointers as addresses is no good - it won't work with saving and loading projects. So yeah, I guess we may scrap the idea of midiSocket for now - I'll get back to the drawing board and think of something else. I'll investigate existing MIDI bindings code and look for some ideas there.

monocasual commented 4 years ago

1 & 2: midiSocket is not a well defined idea yet because I'm not confident about actual implementation of it.

If you really need a sender-receiver/broadcaster-listener architecture, you could create two generic objects, say Broadcaster and Listener, that provide the ability to send and receive a message (i.e. a MidiMsg object). Then you let MidiDispatcher and each module inherit from them. Or you can favor composition over inheritance and just give MidiDispatcher and each module an instance of those objects.
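
A minimal sketch of the composition variant (illustrative names, nothing Giada-specific):

```cpp
#include <functional>
#include <utility>
#include <vector>

struct MidiMsg { /* raw bytes + metadata */ };

class Listener
{
public:
    explicit Listener(std::function<void(const MidiMsg&)> onMsg)
    : m_onMsg(std::move(onMsg)) {}

    void receive(const MidiMsg& m) { m_onMsg(m); }

private:
    std::function<void(const MidiMsg&)> m_onMsg;
};

class Broadcaster
{
public:
    void subscribe(Listener* l)       { m_listeners.push_back(l); }
    void send(const MidiMsg& m) const { for (Listener* l : m_listeners) l->receive(m); }

private:
    std::vector<Listener*> m_listeners;
};

// A module then simply owns its endpoints as members:
class MidiClock
{
public:
    MidiClock() : m_in([this](const MidiMsg& m) { onMidi(m); }) {}

    Listener* input() { return &m_in; } // handed to MidiDispatcher's Broadcaster

private:
    void onMidi(const MidiMsg& /*m*/) { /* react to clock messages */ }

    Listener    m_in;
    Broadcaster m_out; // for messages this module emits
};
```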

The big box with Sample and MIDI channels is the existing implementation of Sample channels and MIDI channels

Got it. I usually think of channels as parts of the audio engine - that is, everything that is not MIDI-related.

[...]

👍

The midiController module is aware of this binding and has already registered itself in the Dispatcher as a receiver of such a message. When the message is received by midiController, it will call c::events::pressChannel(blah).

Perfect. The audio engine should always be handled through the c:: namespace (aka the glue stuff).

[...]

I really like the idea of splitting the development into working stages. This will help a lot. Thanks!

tomek-szczesny commented 4 years ago

If you really need a sender-receiver/broadcaster-listener architecture, you could create two generic objects, say Broadcaster and Listener, that provide the ability to send and receive a message (i.e. a MidiMsg object). Then you let MidiDispatcher and each module inherit from them. Or you can favor composition over inheritance (https://en.wikipedia.org/wiki/Composition_over_inheritance) and just give MidiDispatcher and each module an instance of those objects.

If I understand you correctly, this is pretty close to my midiSocket idea, except mine was meant to be a Broadcaster and Listener combined. Thanks for the details though. :)

Yesterday I wrote that post where I noticed that it is "impossible" to store and load project files with pointers in them, since Giada will obviously have a different structure in memory. I thought I might overcome this problem by investigating the existing code, namely how channels are handled by their IDs, but as far as I understand it, channels simply get new IDs upon reloading a project, as they get recreated. So addressing channels by their IDs won't work either, since they change their IDs upon loading. Am I correct here? Well, I guess we COULD reconstruct midiDispatcher's table by applying fixes as channels change their IDs upon loading, but that's not the cleanest solution. Or is it?

The audio engine should always be handled through the c:: namespace (aka the glue stuff).

This is good to know - I assume this works both ways and channels should send their status to midiController via glue as well?

keryell commented 4 years ago

For the serializing part you might rely on FlatBuffers or Cap'n Proto if you want something efficient that doesn't handle pointers directly. But this might be overkill for this project.

tomek-szczesny commented 4 years ago

I think that using yet another dependency for this would indeed be overkill, but thanks for sharing a keyword with me - I googled "serialization" and got some reading done on the topic.

monocasual commented 4 years ago

as far as I understand it, channels simply get new IDs upon reloading a project, as they get recreated. So addressing channels by their IDs won't work either, since they change their IDs upon loading. Am I correct here?

Actually you can address channels by ID: that kind of information is stored in the project json file. This is how we provide persistence for connections like plug-ins to channels, channels to columns or Waves objects to channels. On the other hand, new channels get new IDs starting from the most recent value - i.e. if you have channel.id = 4 in your project file, the next channel will have channel.id = 5. This logic is managed by ChannelManager here through the IdManager helper class.
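
In a nutshell, the logic looks like this (an illustrative sketch, not the actual IdManager code):

```cpp
#include <algorithm>

class IdManager
{
public:
    // While deserializing a project: keep the stored ID, advance the counter.
    int load(int storedId)
    {
        m_next = std::max(m_next, storedId + 1);
        return storedId;
    }

    // When the user creates a brand new channel.
    int generate() { return m_next++; }

private:
    int m_next = 1;
};
```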

I assume this works both ways and channels should send their status to midiController via glue as well?

Well, that's a good question. I don't think it's necessary though. Internally, engine parts are allowed to talk to each other directly, with no intermediate layers. Example here. It depends on how "internal" the MIDI layer is. I'll think about it.

tomek-szczesny commented 4 years ago

Actually you can address channels by ID

That's great news! Thanks for the explanation. I saw a channel being referred to by its ID somewhere, but can't find it now. I guess I wasn't paying enough attention while inspecting the storage code.

I was thinking that perhaps I should drop the whole "pointer is an address" idea and just assign fixed 32-bit unsigned integer addresses to "permanent" modules like midiClock and midiController, and address ranges for channels and ports. If channels get consecutive numbers as their IDs - and I'll probably make ports follow this pattern - we might end up with some sort of prefixes for them, to make things nice and simple. This way, midiDispatcher would have some extra work resolving these addresses rather than calling a function by its pointer, but that will surely pay off when serializing and deserializing midiDispatcher tables. My programming background is mostly coding microcontrollers in assembly, so please tell me when my ways get too procedural. ;)
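
Something like this, with all the values made up for illustration:

```cpp
#include <cstdint>

// High bits act as a "prefix" selecting the module class; low bits index
// channels or ports within their range.
using MidiAddress = uint32_t;

constexpr MidiAddress ADDR_MIDI_CLOCK      = 0x01000000;
constexpr MidiAddress ADDR_MIDI_CONTROLLER = 0x02000000;
constexpr MidiAddress ADDR_CHANNEL_BASE    = 0x10000000; // + channel ID
constexpr MidiAddress ADDR_PORT_BASE       = 0x20000000; // + port index

constexpr MidiAddress channelAddress(uint32_t channelId)
{
    return ADDR_CHANNEL_BASE + channelId;
}

constexpr bool isChannel(MidiAddress a)
{
    return (a & 0xFF000000) == ADDR_CHANNEL_BASE;
}
```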

I'm sticking to my idea of the sender/receiver scheme and the whole addressing business for the sake of code flexibility. I described practical cases of sending MIDI clock to hardware controllers, or interconnecting channels in Giada itself, and who knows what other combinations someone will think of in the future. The only way to avoid a terrible mess in the code is the sender/receiver thing, or a connection matrix, whichever way one pictures it.

Internally, engine parts are allowed to talk to each other directly, with no intermediate layers. Example here. It depends on how "internal" the MIDI layer is.

Let me ask you a stupid question first: Is glue related to threading in any way, or just tidiness? Either way, it doesn't sound like a good idea to let midiController talk to channels through glue, and let them respond to midiController directly.

monocasual commented 4 years ago

Let me ask you a stupid question first: Is glue related to threading in any way, or just tidiness?

Not stupid at all. Yes, some glue:: parts are related to threading. For example, functions in glue/events.h don't talk to the engine directly. Rather, they push "events" to a lock-free queue that will be consumed later on by Mixer (one of the many engine components). This is done to guarantee thread-safe communication - usually between the UI thread and the audio thread.
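
The gist of the pattern is roughly this (an illustrative sketch, not the actual Giada queue):

```cpp
#include <array>
#include <atomic>
#include <cstddef>

struct Event { int channelId; int type; };

// Single-producer single-consumer ring buffer: the UI/MIDI thread pushes,
// the audio thread pops at the start of each callback. No locks involved.
template <std::size_t N>
class SpscQueue
{
public:
    bool push(const Event& e)
    {
        const std::size_t head = m_head.load(std::memory_order_relaxed);
        const std::size_t next = (head + 1) % N;
        if (next == m_tail.load(std::memory_order_acquire))
            return false; // full: drop, never block the producer
        m_buf[head] = e;
        m_head.store(next, std::memory_order_release);
        return true;
    }

    bool pop(Event& out)
    {
        const std::size_t tail = m_tail.load(std::memory_order_relaxed);
        if (tail == m_head.load(std::memory_order_acquire))
            return false; // empty
        out = m_buf[tail];
        m_tail.store((tail + 1) % N, std::memory_order_release);
        return true;
    }

private:
    std::array<Event, N>     m_buf{};
    std::atomic<std::size_t> m_head{0}; // written by the producer only
    std::atomic<std::size_t> m_tail{0}; // written by the consumer only
};
```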

Either way, it doesn't sound like a good idea to let midiController talk to channels through glue, and let them respond to midiController directly.

Makes sense to me.

tomek-szczesny commented 4 years ago

Okay, thanks for all the information so far! I'll do my best to create glue files for midiController and midiClock, since those will get a lot of feedback from the audio engine. Not sure about midiDevice just yet.

Right now I'm trying to solve the problem of numbering ports. Upon investigation I found a bug, by the way.

As far as I can tell, Giada gets a port list from RtMidi and uses item numbers on this list as port references in the config file. This doesn't work too well when other programs create and remove ports and shuffle RtMidi's port list. Moreover, Giada creates its own ports that make it onto the list, causing even more havoc and often connecting to the wrong ports once the port list gets reordered. Frankly, I had to restart Giada 3 times to connect its inputs and outputs the way I wanted.

The solution, obviously, is to refer to MIDI ports by the full name reported by RtMidi, both in the config file and internally. Who said addresses must be numbers, after all?
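
Resolving a stored name back to a current port index would then be trivial (sketch, untested):

```cpp
#include <RtMidi.h>
#include <optional>
#include <string>

std::optional<unsigned int> findPortByName(RtMidiIn& in, const std::string& wanted)
{
    for (unsigned int i = 0; i < in.getPortCount(); i++)
        if (in.getPortName(i) == wanted)
            return i;
    return std::nullopt; // port disappeared or was renamed
}
```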

EDIT: Yet again I published some changes to the wiki, most notably a chapter about handling multiple MIDI ports. I'm also working on feature checklists for each development stage and continuously correcting the layer definition and implementation chapters. Oh, and I scrapped the old midiSocket. It's not necessary anymore, since I pushed the task of resolving addresses onto midiDispatcher.

tomek-szczesny commented 4 years ago

I've had a fever for the past few days, so I stayed in bed and played around with my no-GUI lappie. I've set up a vim environment that compiles edited files on the fly and picks up my stupid newbie errors pretty well. ;) On Sunday I set off on my vacation trip to nowhere, and I won't be able to visit github (it doesn't support lynx for whatever reason :D). However, I've set up git, so I can still work on the code and wiki.

tomek-szczesny commented 4 years ago

Hi hello, I'm back! Phew, countless kilometres and beer pints on the mountain trails left me with little time for coding itself, but I did have a lot of time to think about it ;)

So, here is the current state of the work. I implemented the midiMsg class and the midiPorts module, which hopefully is able to handle multi-port communication now. There were numerous obstacles along the way, so here's what changed in the general idea:

So, I'm very happy with my progress so far and I learned tons of C++ in the process. Keep your fingers crossed, I'll try to finish Stage 1 and see if it works. :)

EDIT: So yeah, my files do pass a syntax consistency check, but don't compile when using "make". What should I do to get my new files included? Modify Makefile.am by hand?

EDIT: I added a Work in Progress Pull Request, as suggested by @keryell, so feedback can be given in a clean manner.

EDIT: So I guess Stage 1 is done and pushed for everyone to see! Woo! I am so surprised, almost moved, that 98% of the code I wrote on my holiday trip actually works without a hiccup.

I found out that MIDI lightning doesn't work too well on the current release, which makes me think it'd be nice to pull this whole overhaul thing off before 0.18...

monocasual commented 4 years ago

Welcome back Tomek! I'll take some time to study your changes in the next few days. My fingers are crossed, I'll keep you posted :+1:

EDIT: remember to rebase your PR regularly over master, as we are pushing many changes (mostly CMake-related) in the meantime. Feel free to ask if you need help.

tomek-szczesny commented 4 years ago

Thanks! No hurry with your review though, this is WIP after all, but it's going to get more complex the longer you wait. ;) I'm still on leave until the end of the week, so further developments may be expected.

I managed to merge monocasual/master into my midi-overhaul branch once with some GitHub button, but I can't find it anywhere now. I guess it had something to do with one conflict that I resolved manually. I think I'll rebase at least every time my PR can no longer be merged automatically. For now it compiles and works, and that's all I need for testing. :)

EDIT: Now I'm working on midiDevice, a module handling global messages. In the current midiDispatcher code there are numerous calls to the model namespace - I've seen these files before, but their purpose is a mystery to me. Could you briefly explain what they do? Thanks!

tomek-szczesny commented 4 years ago

Currently I'm dissecting the old midiDispatcher and I think I'll have to change my plans slightly. Its code will be split between midiController, midiDevice and midiLearner. I think I should devote some time to updating the wiki at some point...

(I wouldn't mind an answer to the previous question :) )

monocasual commented 4 years ago

EDIT: Now I'm working on midiDevice, a module handling global messages. In the current midiDispatcher code there are numerous calls to the model namespace - I've seen these files before, but their purpose is a mystery to me. Could you briefly explain what they do? Thanks!

The model:: namespace contains... the data model: in other words, the central data structure where channels, audio samples, plugins and actions are stored. Specifically, the namespace contains the model itself and some functions to operate on it in a thread-safe and lock-free way. You have to call those functions if you want to read or alter the model. The "how" part is a bit tricky - let me know if you want more details. :)

tomek-szczesny commented 4 years ago

Um... Thanks, I'll try some reading on C++ models, maybe? Anyway, it's probably not something I should worry much about at this point.

A quick question though: do we use C++20 already, or should I stick to C++17?

tomek-szczesny commented 4 years ago

Just to make sure: is midiDispatcher::signalCb_ used only to trigger ActionRec right now? Yet again, this is something I didn't quite expect and it doesn't fit my original concept, so I'll exercise the modular capabilities and just make a new submodule for that: midiSignalCb. Yeah!

monocasual commented 4 years ago

Um... Thanks, I'll try some reading on C++ models, maybe? Anyway, it's probably not something I should worry much about at this point.

Well, that's just custom stuff we built to provide lock-free multithreading in Giada.

A quick question though: do we use C++20 already, or should I stick to C++17?

We are currently on C++17, better stick with it.

Just to make sure: Is midiDispatcher::signalCb_ used only to trigger ActionRec right now?

Yes, for the record-on-signal mode (more information on this page).

keryell commented 4 years ago

I agree this project should use C++17 only. C++20 is not fully supported yet, and we shouldn't require end-users to install a compiler different from the default system one. Perhaps in 2 years from now? Honestly, I am already amazed to discover C++17 and some modern stuff used in this project. :-) I have seen so many projects that are basically C code from 1972 inside a file with a .cpp extension...

tomek-szczesny commented 4 years ago

All I know is that it's hard to get a grasp, in a month, of a language with 40 years of constant development behind it, given that my only prior experience with C++ was one college course, and that was before C++11. Like I said, I'm not a programmer. ;)

Okay, I asked because I had to do some fine-tuning of vim so it wouldn't complain about overly modern stuff in there.

Thanks for your support guys, now I ought to clean up this code before I move on to the next stage. :)

tomek-szczesny commented 4 years ago

Help wanted...

Since it seems that all MIDI systems use different names for the same devices, I'll implement some sort of unification, so that a Giada project with saved bindings, device names included, can be transferred cross-platform (or between JACK and ALSA, to say the least...). For this I need a sample of a device name as reported by all four supported systems: ALSA, JACK, Apple's CoreMIDI and Windows MM - preferably the name of the same device on each of them. I can check ALSA and JACK, and haul my Launchpad to work to check out Windows. I don't know anybody who happens to have a Mac (not so popular on this side of the world, I think).

tomek-szczesny commented 4 years ago

Here's another curio: ALSA and JACK send different data to Giada!! :D (This is actually correct behavior - confirmed with a MIDI sniffer in both modes.) Namely, playing around with my Launchpad I noticed that button bindings get shuffled when switching the MIDI system in Giada. It turns out that NoteOffs from my Launchpad have value fields of 0 according to ALSA, and 64 when using JACK. I wonder which one of those is actually correct.

I guess I might as well default the NoteOff value field to zero, since it's never going to be interpreted anyway. What do you think?

tomek-szczesny commented 4 years ago

Pardon the bombardment of this thread, but here's another question. We use RtMidi 3.0.0 and 4.0.0 is already available. Can we upgrade, and if so, how do we do it? The reason is that I re-found this bug, which was fixed after 3.0.0, and it'd be really nice to have it working ;)

tomek-szczesny commented 4 years ago

[midioverhaul diagram]

Here's a new graph for the wiki page. Too bad I don't have my own hosting anymore ;)

Here's what changed:

midiBindings are json-serializable objects returned by midiLearner. They represent user-defined MIDI controller triggers that are expected to do something. A binding stores information such as "some specific NoteOn from controller x" or "a CC parameter change on controller x while holding a button on controller y". It houses its own buffers and logic to handle all supported cases. Messages can be checked for compliance with a binding using the check() method, collecting a return value if applicable. It supports triggering on button push or release, CC changes (I assume all knobs and faders use CC), and all combinations with another button being held at the same time.
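
A rough shape of the idea (fields and signature illustrative, not final code):

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <vector>

// The trigger definition is plain data (hence json-serializable); check()
// answers "does this message match, and what value does it carry?".
struct midiBinding
{
    std::string            port;     // e.g. the full name of "controller x"
    uint8_t                status;   // 0x90 NoteOn, 0xB0 CC, ...
    uint8_t                data1;    // note or CC number
    std::optional<uint8_t> modifier; // button that must be held on controller y

    std::optional<uint8_t> check(const std::string&          srcPort,
                                 const std::vector<uint8_t>& msg,
                                 bool                        modifierHeld) const
    {
        if (srcPort != port || msg.size() < 3)
            return std::nullopt;
        if ((msg[0] & 0xF0) != status || msg[1] != data1)
            return std::nullopt;
        if (modifier && !modifierHeld)
            return std::nullopt;
        return msg[2]; // velocity or CC value
    }
};
```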

monocasual commented 4 years ago

@tomek-szczesny I can't help with the physical part right now, as I don't have any MIDI device available at the moment. You could try with your Launchpad in a virtualized macOS maybe? This script helps a lot in the installation under VirtualBox.

We use RtMidi 3.0.0 and 4.0.0 is already available. Can we upgrade, and if so, how do we do it?

The problem is that Ubuntu Bionic (the distro we are using on Travis for building binaries) still ships RtMidi 3.0.0. Things will change with #315 and #387 though, both planned for 0.17.1.

I guess I might as well default the NoteOff value field to zero, since it's never going to be interpreted anyway. What do you think?

Isn't it the old problem of NoteOff sent by some devices as NoteOn with velocity = 0 or something like that?

tomek-szczesny commented 4 years ago

The problem is that Ubuntu Bionic (the distro we are using on Travis for building binaries) still ships RtMidi 3.0.0. Things will change with #315 and #387 though, both planned for 0.17.1.

I think that even Ubuntu Groovy is stuck with RtMidi 3.0.0. If I understand you correctly, will all dependencies be freshly pulled from the respective GitHub masters on each git submodule update, or will they rely on specific tags?

You could try with your Launchpad in a virtualized macOS maybe? This script helps a lot in the installation under VirtualBox.

Orrr maybe I'll just leave it as a TODO for now. ;) Handling multiple ports opened a lot of nasty cans of worms. Ports change their order in the RtMidi port list. Bindings must include port names. Heck, ports even change their names if you plug the same controller into a different port. Every MIDI system uses different port name conventions, and who knows how those will change in the future. I admit my development effort had to slow down for a while to rethink the strategy for handling all this, once and for all.

Isn't it the old problem of NoteOff sent by some devices as NoteOn with velocity = 0 or something like that?

Nah, that's a different story. Some controllers send NoteOn with zero velocity, and by that they actually mean NoteOff. This trick is actually allowed by the MIDI standard, because it is cheaper in terms of data rate to send many messages of the same type in a row (using Running Status).

The problem I discovered is that NoteOff messages have different velocities as seen by JACK and ALSA. I even captured both at the same time using two MIDI monitor instances (which hopefully are not RtMidi based). I suspect that ALSA just strips the NoteOff velocity, as this is a truly obscure (although valid) feature of the MIDI standard. However, the same standard recommends a NoteOff value of 64 if release velocity is not implemented in the controller. I think I'll just default the NoteOff value to 64 for now, as Giada never parses this value anyway. Nevertheless, I'll leave a remark in my TODO to look into this issue.

keryell commented 4 years ago

I am lost on the RtMidi packaging in Ubuntu & Debian. Why is the package named librtmidi4 if this is version 3? Is this a packaging bug? I checked on my Debian/unstable laptop, and /usr/share/doc/librtmidi4/changelog.gz actually talks only about version 3.0.0... In Debian/experimental there is a librtmidi5 (https://packages.debian.org/search?keywords=librtmidi5) which is RtMidi 4.0.0... :-/

keryell commented 4 years ago

For the NoteOff velocity difference, on Linux, Jack is based on ALSA, isn't it? So if ALSA was discarding some stuff, why would Jack get the information? Curious...

tomek-szczesny commented 4 years ago

Why is the package named librtmidi4 if this is version 3?

I don't even attempt to make any sense out of it. The package name has a number, the package itself has a version, and who knows what the version of its contents is :) That's why I'm interested in the exact mechanics of depending solely on github sources.

For the NoteOff velocity difference, on Linux, Jack is based on ALSA, isn't it?

That's a good point, never thought of that. So there you go, you provoked me to get to the bottom of this. ;)

A low-level ALSA tool, aseqdump, returns something like this on a single button push:

 28:0   Note on                 0, note 102, velocity 127
 28:0   Note off                0, note 102

According to the source code, this formatting of the "NoteOff" message actually means that a NoteOn with velocity = 0 was received from ALSA and converted by the tool itself.

JACK, however, "fixes" that for you. Basically, it replaces zero-velocity "NoteOn" messages with NoteOff and a velocity of 64.

Great... So I guess the solution would be to normalize both NoteOn and NoteOff with zero velocity to NoteOff with velocity 64, just to be sure. Should any MIDI system send NoteOff messages with an actually meaningful velocity, those will get through.
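
Something like this (sketch, untested):

```cpp
#include <cstdint>
#include <vector>

// Normalize both "NoteOn, velocity 0" and "NoteOff, velocity 0" to NoteOff
// with velocity 64, so bindings behave identically under ALSA and JACK.
// Real NoteOff release velocities pass through untouched.
void normalizeNoteOff(std::vector<uint8_t>& msg)
{
    if (msg.size() < 3)
        return;
    const uint8_t type = msg[0] & 0xF0;
    if ((type == 0x90 || type == 0x80) && msg[2] == 0) {
        msg[0] = 0x80 | (msg[0] & 0x0F); // force NoteOff, keep the channel
        msg[2] = 64;                     // standard default release velocity
    }
}
```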

monocasual commented 4 years ago

That's why I'm interested in the exact mechanics of depending solely on github sources.

@tomek-szczesny as you correctly assumed, the dependencies will be freshly pulled from submodules on git submodule update. However, the main repository (giada) is configured to point to a specific commit for each submodule, so the changes will be pulled from there - not from the submodule's master branch.

Bonus point: CMake is able to deal with git submodules, so the whole process can be executed at configuration time. This stuff will be investigated in #387 . 🤞

tomek-szczesny commented 4 years ago

Okay, so I came to the conclusion that multiple port support is the most necessary feature now - without it I can't even fully test everything I've done so far.

I'd like to ask how we envision this from the user's perspective.

First, the config window. Instead of the Choice widget, we'll need something else to pick more than one input and output port. FLTK does not provide anything like a Check_Choice, which would be a perfect fit there, and I don't think I can make one. There is a readily available Check_Browser, a scrollable list with checkable items, but it is definitely not minimal. There are more problems though: how do we associate midimaps with ports (pairs of ports!), and how do we select other options and tweaks for each port, like "get clock from here", "send clock to there", "this one is the default input for channels", and whatever we'll come up with in the future? AFAIK there is no other software that allows such freedom and flexibility, thus nothing to draw inspiration from. Perhaps it could be solved with tables of settings?

Secondly: how do we handle the fact that port names change? Both ALSA and JACK add numbers to port names, which seem to be unique and change when you plug a controller into a different USB port. When you switch between JACK and ALSA, the number is the same but the formatting is completely different, and like I stated before, I'd rather not depend on it not changing in the future. I'm looking for a solution that doesn't involve reconfiguring global bindings, and especially project bindings, when this happens. It wasn't much of an issue with a single MIDI port and port-agnostic bindings. I've got three ideas so far:

  1. Let users give "nicknames" to ports right from the start - if a port changes its name, reassigning a nickname to another port would restore everything to normal.
  2. Pop up a window saying that some ports seem to be missing and others have shown up, and let the user manually tell which one is which - this lets the user know that something is wrong right from the start, but might be more confusing.
  3. TRY guessing which port changed its name to what, by implementing a neural network AI sourcing computing power from a peer-to-peer network built specifically for this task. In other words, something that would make the user's choices from the previous point, but reliably.

Personally, I think that port nicknames might be a very good solution - users could choose meaningful names that show up in other parts of the program, making inconsistent binding problems less severe.

Okay, I thought this list would go on, but that's all I need to know for now. Please share your thoughts. :)

tomek-szczesny commented 4 years ago

I drafted one proposal for the config menu tree:

  1. General settings
     1.1. Default channel input
     1.2. Default control surface
     1.3. Etc.

  2. Input ports
     2.1. (first detected input port)
          2.1.1. Enable (check button)
          2.1.2. Short Name (custom name the user can give to a port) (text)
          2.1.3. Clock input (none / MC / MMC / any)
     2.2. (another detected port, following the pattern)

  3. Output ports
     3.1. (first detected output port)
          3.1.1. Enable (check button)
          3.1.2. Short Name (custom name the user can give to a port) (text)
          3.1.3. Clock output (none / MC / MMC / both)
     3.2. (more ports if available)

  4. MIDI maps
     4.1. Midi Map 1
          4.1.1. Input port
          4.1.2. Output port
          4.1.3. Options
     4.2. Add (another midimap from a collection of found maps)

Just to clarify: we need to allow more than one midimap if we plan to use many controllers. Each must be pointed at the input and output port of a controller. An "Options" menu, or just a few items in place of "options", could allow some future uses. One I can think of is sending a configuration SysEx stream to a controller, something that should not repeat on every run, like sending a predefined controller template to more advanced devices. An "Add" option is preferred over a full list of available midimaps, because the latter will become problematic once we have more than 5 maps available.

So I explored my idea of making a table for all these settings, but after digging through the FLTK Table class documentation I gave up just looking at it. ;) Instead, I think the top-level options could be tabs. Second-level options (for each input and output port) should be something other than tabs, due to their long names, which are unfit for tabs. The same probably applies to midimaps as well.

How about a tree on the left, pretty much like Ardour's options window?

monocasual commented 4 years ago

First, the config window. Instead of the Choice widget, we'll need something else to pick more than one input and output port.

This is easy (I assume). You can implement what is currently available in the Sample Channel menu, that is, the drop-down menu that pops up when you click on a Sample Channel's main button. The code is here. In other words, a button that, when clicked, shows a menu with multiple checkboxes. What to display in the button when more than one port is selected? I'd go with something like "[multiple ports selected]". The same can be done for midimaps of course.
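
In FLTK terms, something like this (untested sketch; the port names are hard-coded placeholders that would come from RtMidi at runtime):

```cpp
#include <FL/Fl.H>
#include <FL/Fl_Menu_Button.H>
#include <FL/Fl_Window.H>

int main()
{
    Fl_Window win(300, 100, "MIDI ports");
    Fl_Menu_Button ports(20, 30, 260, 30, "[no ports selected]");

    // FL_MENU_TOGGLE turns each entry into a checkbox.
    ports.add("Launchpad Mini MIDI 1", 0, nullptr, nullptr, FL_MENU_TOGGLE);
    ports.add("ReMOTE25 MIDI 1",       0, nullptr, nullptr, FL_MENU_TOGGLE);

    win.end();
    win.show();
    return Fl::run();
}
```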

Secondly: how do we handle the fact that port names change? Both ALSA and JACK add numbers to port names, which seem to be unique and change when you plug a controller into a different USB port.

Weird, I can't reproduce such behavior right now. Anyway, what if we just do nothing internally and simply notify the user to update the setup? I mean, if I have rearranged my physical configuration, I would expect to do some work in the software to reflect the changes. I definitely wouldn't hope for a peer-to-peer-based AI system to detect and adjust my new port layout :). However, I'm probably missing some details here... Is this something a user can live without?

tomek-szczesny commented 4 years ago

A drop-down menu is cool, but as I pointed out, it might be necessary to adjust settings per controller (for example, clocks), so a menu with toggles won't be enough. The options tree I outlined has three levels. It could be flattened down to two levels, letting us use drop-down menus to configure everything, as long as all options are boolean or enum, which seems to be the case for now (if we forget about nicknames for a moment). That might be the solution I'll implement and see how it works, but if it's too clunky I'll let someone more experienced make it better. :)

Secondly: how do we handle the fact that port names change? Both ALSA and JACK add numbers to port names, which seem to be unique and change when you plug a controller into a different USB port.

Weird, I can't reproduce such behavior right now.

Here on my setup, which is Ubuntu Studio 20.x, I get the following port names for one of my controllers:

JACK: a2j:ReMOTE25 [20] (capture): ReMOTE25 MIDI 1
ALSA: ReMOTE25:ReMOTE25 MIDI 1 20:0

Plugging the ReMOTE into different USB ports indeed didn't change the number "20" at all. This is probably because it was the only MIDI controller present in the system, or maybe it has a serial number stored in it, contrary to the Launchpad - I don't know. This number seems to be an ALSA port number, by the way.

Then I added the Launchpad:

Launchpad Mini:Launchpad Mini MIDI 1 32:0

Then I plugged it into the port where the ReMOTE was sitting before (it's hard to do a proper experiment with a cat purring on my lap):

Launchpad Mini:Launchpad Mini MIDI 1 20:0

and the ReMOTE in place of the Launchpad (not sure if it matters):

ReMOTE25:ReMOTE25 MIDI 1 32:0

The reason I find this severe is that even if the user connects their setup exactly the same way as before, port names might change and ruin all the bindings in the project. Imagine binding all channels to buttons on the Launchpad again, especially when you are expected to perform live. I bet most users won't be aware that they are expected to plug in their controllers in the same order and into the same USB ports. I wonder how other software handles this situation...

Anyway, what if we just do nothing internally and simply notify the user to update the setup?

That won't work, since in a multi-MIDI environment bindings depend on port names. If we decide to make them depend on something else, such as port numbers, and let the user reassign ports to numbers to fix the problem, it won't be much different from the "nickname" idea.

On the other hand, you may be right that unless someone touches their setup they should be fine, and the problem won't occur in most circumstances. I'll probably ignore this for now (and keep my controllers plugged into my desktop, ugh), but if it proves to be a pain in the back, I think a clever tool for manually reassigning existing ports to expected ports will do just fine - a quick solution, and not the worst to implement either: it would simply overwrite all global and project bindings, and that's it. It should solve all the problems, including migration to another MIDI system, AND its development can be postponed.

monocasual commented 4 years ago

a menu with toggles won't be enough

Ok, now I get your point about the config menu tree. Tomorrow I'll upload a quick UI mock-up to better illustrate what I think we can do with the tools available from FLTK. Also, I have plans to make the main configuration window fully resizable: this should ease the design process.

This is probably because it was the only MIDI controller present in the system,

Ah! That could be the case, as I have only one MIDI device here at the moment.

tomek-szczesny commented 4 years ago

Cool! I'm genuinely curious about your concept. :)

Jackojc commented 1 year ago

It would definitely be nice to get this into Giada at some point, because I think it would be great as simply a MIDI instrument that I can trigger easily from sequencers. I've had trouble getting it to work so far because of existing limitations under JACK and such.