djipco / webmidi

Tame the Web MIDI API. Send and receive MIDI messages with ease. Control instruments with user-friendly functions (playNote, sendPitchBend, etc.). React to MIDI input with simple event listeners (noteon, pitchbend, controlchange, etc.).
Apache License 2.0

Note objects in WebMidi #20

Closed: cfry closed this issue 5 years ago

cfry commented 7 years ago

Here's my suggestion for how we could make note objects without being disruptive to the existing WebMidi. Each method that takes an "options" argument would simply accept the other args as fields in the "options" argument. Here's the beginning of a playNote that could accommodate this, transforming a call like input1.playNote("C#4", 1, {duration: 1000}) into input1.playNote({pitch: "C#4", channel: 1, duration: 1000})

enabling:

var my_note = {pitch: "C#4", channel: 1, duration: 1000}
input1.playNote(my_note)

Via:

Output.prototype.playNote = function(note, channel, options) { // same params as now
  if (typeof note === "object") {
    if (channel) {
      throw new TypeError("Pass the channel inside the note object, not as a separate argument.");
    } else {
      channel = note.channel;
      options = note;
      note = options.pitch;
    }
  }
  // ... rest of existing implementation of playNote
};

Observe that the MIDI convention of using the word "note" to mean "pitch" is a disaster, so I don't recommend using it. So I'd prefer:

Output.prototype.playNote = function(pitch, channel, options) {
  if (typeof pitch === "object") {
    if (channel) {
      throw new TypeError("Pass the channel inside the pitch object, not as a separate argument.");
    } else {
      channel = pitch.channel;
      options = pitch;
      pitch = options.pitch;
    }
  }
  // ... rest as before
};

In any case, using this style to extend the methods that take "options" as an argument gives you the capacity to make and pass around notes as one literal JS object, and maintains backwards compatibility.

I have a class called Note. If we had the above def for playNote, I think I could just pass an instance of my Note class into playNote and it would work. Furthermore, anyone could build their own Note-like class, and as long as they used the same field names, value types, and ranges as your code (which I did!), they could add whatever extra fields and methods to their class and things should work out.
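For instance, a minimal sketch of such a Note-like class (illustrative only; the object-accepting playNote is the proposal above, not the current WebMidi API):

function Note(pitch, channel, duration) {
  this.pitch = pitch;       // e.g. "C#4" (name + octave)
  this.channel = channel;   // MIDI channel, 1-16
  this.duration = duration; // milliseconds
}

// Extra methods are harmless: playNote would only read the fields it knows about.
Note.prototype.describe = function() {
  return this.pitch + " on channel " + this.channel + " for " + this.duration + "ms";
};

var my_note = new Note("C#4", 1, 1000);
// input1.playNote(my_note); // would dispatch on the typeof check above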

djipco commented 7 years ago

This is an interesting idea.

I'm not sure it would be wise to merge the note and options parameter, though. The options object allows options that do not necessarily belong in the Note object. A good example is the time property used for scheduling.

Now, which methods would require the change? In the Output object, I can see the following:

  • playNote()
  • sendKeyAftertouch()
  • stopNote()

In the Input object, the only method to update would be addListener() but it would have to be updated in the context of the following events:

  • noteoff
  • noteon

The Note object should probably have the following properties:

  • MIDI note number
  • name
  • octave
  • duration
  • velocity (normalized and raw)
  • release velocity (normalized and raw)

A question that quickly arises is: how do we create a Note? Do we specify a note number and it derives the name and octave, or the other way around, or perhaps both are possible?

Am I overlooking anything obvious?

cfry commented 7 years ago

First, a bit about myself. I'm 64. I started programming at MIT in the late 1970s. I wrote A LOT of music software (in Lisp), but not signal processing or digital audio; rather the symbolic stuff. A very large program I wrote was called "Flavors Band", on the Lisp Machine at the MIT AI Lab. It composed (and improvised) musical "scores" based on direction (some very high level, some pretty low level) from a human. I worried about data structures a lot.

But let me say also that you've got some great experience too, not just with MIDI but with your robot and IoT things. I will talk to you about all of that some day, I hope, as the MIDI software I'm writing is part of my robot IDE. I bet you could help with that too!

Mostly what I've done in the past few decades is general purpose language design, IDE design and implementation, and some Common Sense reasoning and Nat Lang stuff, including Nat Lang for programming. So I'm all about language.

So far, WebMidi has been great for me. Saved me a lot of work, and I especially appreciate the work you put into the documentation. Rare and more than good!

OK onto the issues: "The options object allows options that do not necessarily belong in the Note object. A good example is the time property used for scheduling."

I'm not too concerned about extra junk in a note object. A user might want to throw in their own attributes of who knows what. The important thing about a Note should be that it has certain fields with certain known types & ranges that WebMidi core functions depend on. If it has other stuff, so what?

Now I'm confused about "time" not being in a note. Don't you want to schedule when to play the note? Doesn't playNote already take advantage of time and not play it until the right time? Any decent note data structure should have the start-time of the note within it, and naming it "time" is ok by me. If time==undefined or null, then assume the start time is NOW, otherwise, schedule it.
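For what it's worth, scheduling can already ride along in the options object (a sketch, assuming WebMidi v2's documented time option, where a string starting with "+" is an offset in ms from now):

var output = WebMidi.outputs[0]; // any connected output

// Schedule a C4 one second from now, for 500 ms:
output.playNote("C4", 1, {time: "+1000", duration: 500});

// cfry's suggestion: the same information carried by the note object itself:
var my_note = {pitch: "C4", channel: 1, time: "+1000", duration: 500};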

Now on to other properties. I don't like redundancy in a data structure, not so much because it takes up more memory, but because if you change one field, then you have to change the others or else you'll have an inconsistent note. But if you remove the redundancy, you get rid of the possibility of inconsistency. So I'd say, for example, you should have pitch (MIDI note number) but NOT octave or name ("C#4"), as they are redundant. In my Note class I have just a pitch integer. I have a method that extracts the octave, and a method that derives the note name from the pitch. (I also have a method for "pitch class number" (0 through 11), a method for pitch_class_name ("C", "C#", etc.), and more fancy stuff.)
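In code, those derivations are one-liners (a sketch, assuming the common convention that maps MIDI note 60 to "C4"):

var NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

function octaveOf(pitch)         { return Math.floor(pitch / 12) - 1; }
function pitchClassOf(pitch)     { return pitch % 12; }
function pitchClassNameOf(pitch) { return NAMES[pitch % 12]; }
function nameOf(pitch)           { return pitchClassNameOf(pitch) + octaveOf(pitch); }

// nameOf(60) => "C4", octaveOf(60) => 4, pitchClassOf(61) => 1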

Same thing for "normalized" and "raw". I'm not sure which is right, but I'm pretty confident that having only one of them is right. You know MIDI far more than me. I like the tightness of an integer 0 -> 127 (if that's what RAW is), but logically I also like a 0 to 1 float. Whichever one is picked, have a method to get the other one. I guess I'd say if the output is always MIDI, and we want that to be quick and not lossy from floating-point conversions, go with the 0 -> 127, but I'm not so confident of this, as LOGICALLY I'd rather work with 0 to 1.
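Either way, converting between the two is trivial (a sketch; 0 -> 127 kept as the stored form since that's what goes on the wire):

function toNormalized(raw) { return raw / 127; }                    // 0-127 -> 0.0-1.0
function toRaw(normalized) { return Math.round(normalized * 127); } // 0.0-1.0 -> 0-127

// Round-tripping from the raw side is exact: toRaw(toNormalized(100)) === 100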

Now on to methods. Since I have an actual Note class, it's easy to stick a method on it like "get_octave". You have the Output class that you stick playNote on. You'll have to forgive my ignorance here, but to my mind this notion of "channels" and "output device" is confusing. I want one "instrument" designation that indicates the timbre of the note. How that maps to channels and "output" I don't understand. You have notes; each is played on a particular instrument. I assume the instrument is polyphonic. So, in my mind, the right way to think is: I have a note that has associated with it an instrument. my_note.play() plays that note on whatever the "instrument" property of the note says to. (We can default it like everything else, of course.) It's OK with me if there's some "map" that says instrument 12, or "sax", is channel 5, output device 6, or whatever, but that's independent of the data in the note.

Here's one case where this modularity of instruments makes sense. Say I have a brass band and I want to score a chord played by a 'bone, a sax, and a trumpet. I should just be able to indicate those three notes, say C3bone E3sax G3tpt, and win. Or maybe I want C3sax E3bone G3sax: nice local changes to the relevant data structures.

In our "map" if we want to remap all uses of "sax" to "flute" in one fell swoop, that's fine with me. What becomes important in the note structure is just that C3 and G3 are played by the same instrument and E3 is played by a different one.

BUT, you're the MIDI expert here and I don't want to spend the rest of my life on music, just a little part. So I'm throwing out some cards on the table, and you can rearrange to suit the underlying implementation, which I don't understand.

PS: I have one other class in my code, a "phrase", which has, essentially, an array of notes. Most of the interesting processing will be on phrases, not notes. But they look surprisingly similar, i.e. they both have times and durations and defaults for channels, etc.

I considered Flavors Band to be, at its core, a phrase-processing library. (Before MIT I went to Berklee, across the river. Berklee is the MIT of jazz.)

PPS: Your question: how do we create a note? I like the core method accepting the MIDI note integer. But sure, we ought to be able to make such a number from pitch class and octave. So, for instance: new Note({pitch: pitchclass_and_octave_to_num("C#", 4), duration: 1})
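A possible definition of that helper (hypothetical, and assuming the middle-C-is-C4-is-60 convention, so octave n starts at (n + 1) * 12):

var PITCH_CLASSES = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                     "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11};

function pitchclass_and_octave_to_num(pitch_class, octave) {
  return (octave + 1) * 12 + PITCH_CLASSES[pitch_class];
}

// pitchclass_and_octave_to_num("C#", 4) => 61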


djipco commented 6 years ago

It seems I missed your response when you originally posted it. Sorry. Since your response is long, let me break it down to smaller pieces.

Any decent note data structure should have the start-time of the note within it, and naming it "time" is ok by me.

I'm not sure I agree here. A note and the time you play it at are two different things. You might want to play the same note at a different time. That's why playNote() requests a note and (optionally) a time. Honestly, I'm not even convinced it should have a duration property. To me, start time and duration/stop time probably belong in a higher level object.

djipco commented 6 years ago

So I'd say, for example, you should have pitch (MIDI note number) but NOT octave or name ("C#4")

One is obviously going to be derived from the other. I'm not going to store both. The question is, which one? It may seem trivial but you have to take into consideration that octaves are not interpreted in the same way on different devices (see issue https://github.com/cotejp/webmidi/issues/18).

If I store the MIDI note number, it will yield different notes on different systems. Then again, this will be mitigated by the ability to "transpose" planned for v2.1.

djipco commented 6 years ago

You'll have to forgive my ignorance here but to my mind this notion of "channels" and "output device" is confusing. I want one "instrument" designation that indicates the timbre of the note. How that maps to channels and "output" I don't understand.

A MIDI "device" (synth, drum machine, software, lighting gear, etc.) is a piece of software or hardware that can speak MIDI. Devices have up to 16 channels of MIDI communication. MIDI devices can exchange messages on any or all of those channels. What they do with the received messages is up to them. So, MIDI itself does not have a concept of "instrument". MIDI is a communication protocol. In that sense an Instrument object is out of scope for WebMidi.js.

Note: some JavaScript libraries will let you build instruments (such as Tone.js) which in turn can speak MIDI.

So, in my mind, the right way to think is: I have a note that has associated with it an instrument. my_note.play() plays that note on whatever the "instrument" property of the note says to.

That's a very interesting (and unusual) way to look at it. It's as if the note knows which instrument it should go to to get played. I expect most people to look at it the other way around: tell this instrument (combination of device and channel) to play that note. Your note-centric approach is intriguing but I don't see it as part of WebMidi.js itself.

I do understand part of your frustration, though. You want to play instruments, not send messages through channels. In MIDI, an instrument is what would listen to messages at the end of a specific channel on a specific device. I'm just not sure WebMidi.js should be handling that.

You could very well create an Instrument() object that accepts outputDevice and channel parameters. This object would have a playNote() method that would simply call outputDevice.playNote(note, channel).
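Something like this (a sketch of the wrapper just described, not part of WebMidi.js itself):

function Instrument(outputDevice, channel) {
  this.output = outputDevice;
  this.channel = channel;
}

Instrument.prototype.playNote = function(note, options) {
  this.output.playNote(note, this.channel, options);
};

var sax = new Instrument(WebMidi.getOutputByName("Axiom"), 5);
sax.playNote("C3", {duration: 500}); // "play a C3 on the sax"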

cfry commented 6 years ago

Interesting. So please do a user experience test for me. Take a really common nursery-rhyme melody, say Three Blind Mice. Now rearrange the start times of your "notes" (which don't have start times) and play the melody to pretty much anyone. Now, just on the off chance they aren't able to name the melody, play the original and see if they can name that.

Now let's compare, for our control study, a feature of both your and my definition of a note: its channel. (I prefer the name "instrument", but then I also think of such other ridiculous standards as middle C being called 60, standardizing the frequency of pitches to "A" instead of "C", or, better, having the starting pitch class be named "A" instead of "C". But I digress.) In any case, try the same user experience test as above, but instead of changing each note's start time, change their channel/instrument and ask the participants if they can name the tune (either way). Now perhaps it's just me, but with my feeble musical skills, I can identify the melody regardless of its instrument but for some reason can't when the start times are changed. So for me, the start time is a more important aspect of a note than its channel/instrument, which we both agree should be in a note.

I will grant you that the duration is of lesser importance than start time or pitch. I bet most of the subjects in the above experiments will be able to identify melodies if you just change duration. But due to the real-time nature of MIDI, perhaps we should say it doesn't even have notes, it only has "events", two of which make a note. I think the MIDI community also uses the word "note" to mean pitch, to which I ask: so what name do you use for an actual note?

So I think we're down to the common ground of agreeing that a "note" has a pitch, a channel/instrument, a volume, and that's about it. Well, if duration isn't a part of a note, what is it a part of? Even conventional music notation (which is one of the worst languages ever not designed) has duration captured in the shape of the note's glyph, and its start time captured in its horizontal ordering with respect to the preceding notes.

I don't really care so much about terminology (well, actually I do), but more important is the ability to understand the complex data of music and to be able to manipulate it in semantically meaningful ways (well, OK, it's ART, but I think, being a programmer, you know what I mean). The primary conceptual tool for managing complexity is to bundle features together and work at a higher level. To the extent that a modularity helps you think most easily about the data and "plays well" with other "modules" of that same modularity, it's good.

What I like about your software is that it allows me to ignore details you had to sweat over that I don't really care about (so long as they work :-). I'm just trying to do the same thing for others who want to move up another notch from the details.

Most of my professors at Berklee would disagree with me. Most of my professors were also lousy teachers and they didn't know why. I do.


djipco commented 6 years ago

Wow! I wasn't expecting that. ;-) I'm not saying timing is more/less important than instrument/timbre. In fact, you seem to assume I would add a reference to the instrument inside the Note object but I wouldn't.

I think our misunderstanding stems from the fact that we are not talking about the same thing. I'm talking about a communication protocol and you're talking about music notation/semantics.

In fact, I'm no longer sure I should venture into the Note business inside WebMidi.js. Perhaps it would be wiser to build a music-oriented library to do that. This higher-level library could have Phrase objects playing Note objects via Instrument objects. This library would use WebMidi.js for inter-device communications in a way that would completely hide the channels, messages and devices you're not too fond of...

I think this would make more sense than to try and tack musical features onto a library which, in the end, is really about data communication.

cfry commented 6 years ago


You'll have to forgive my ignorance here but to my mind this notion of "channels" and "output device" is confusing. I want one "instrument" designation that indicates the timbre of the note. How that maps to channels and "output" I don't understand.

A MIDI "device" (synth, drum machine, software, lighting gear, etc.) is a piece of software or hardware that can speak MIDI. Devices have up to 16 channels of MIDI communication. MIDI devices can exchange messages on any or all of those channels. What they do with the received messages is up to them. So, MIDI itself does not have a concept of "instrument". MIDI is a communication protocol. In that sense an Instrument object is out of scope for WebMidi.js.

I appreciate you explaining this concept that I didn't understand before. I agree that MIDI is a communication protocol. But there are at least thousands of those. Probably the most important aspects of a communication protocol are: what is it good at communicating? What was it designed to communicate? What semantics do communicators communicate with it? These all usually have the same answer. If MIDI were just "a communication protocol", they could have called it "cp247", or maybe "real time protocol". But they didn't, and the first word of the acronym is???

It is common for communication protocols to be used for things other than what the protocol was originally intended for, or what it is most used as. One could well argue that English is intended for communication between people, but then we have Siri: communication between person and machine. It's great that MIDI can be used to control lights, etc. I would say, for ease of understanding, it's quite often easiest to learn to use a protocol for its original intended purpose, then branch out to (or invent) new purposes.

Because of your explanation, it's clear that device and channel don't have direct analogs for MIDI's original intended purpose. Still, this is unfortunate, because musicians (the main body of users of MIDI) already understand instruments, players, and orchestras. Let me remind you, though, that "groups" of things are sometimes referred to by their most prominent part.

This is a knowledge-representation no-no, but often it's better than some weird group name you've never heard of before. Example: "cat" usually refers to a household cat. The biological term for the category it's in is probably "feline" or something like that, i.e. a much less common word. But when most normal people want to refer to the biological category that "cat" is in, they'd probably say "cats", i.e. essentially the same as its most common member. And if they used the genus name, most people would wonder "what the heck is that?"

So I claim it would behoove us to find terms for the various parts of our musical communication protocol (MCP better than MIDI? Not much). If we call a device a group of "instruments", well, that pretty much works, but what term do we use for a group of instruments that isn't necessarily all the instruments in a song? There's no common name for the sections of an orchestra. I thought it was "sections", but Google shows me both "group" and "family". I like family better, because group, in the context of music, means the whole band, not just a part of it. So a better name for device would be "family", I guess.

Now, in an orchestra, a family might mean "strings", whereas in MIDI we could, at least in theory, have a family composed of instruments that were, say, violin and trumpet. These 2 instruments are not in the same orchestra family, but the more important concept here is a bunch of (not necessarily sonically similar) instruments in a band/orchestra. So I say a device (so damn generic that it's meaningless) would have better been named "family", and we'd call all the devices used in a song the "orchestra". Then I wouldn't have had to waste your time by asking the question "what's a channel or device?"

Note: some JavaScript libraries will let you build instruments (such as Tone.js) which in turn can speak MIDI.

So, in my mind, the right way to think is: I have a note that has associated with it an instrument. my_note.play() plays that note on whatever the "instrument" property of the note says to.

That's a very interesting (and unusual) way to look at it. It's as if the note knows which instrument it should go to to get played. I expect most people to look at it the other way around: tell this instrument (combination of device and channel) to play that note. Your note-centric approach is intriguing but I don't see it as part of WebMidi.js itself.

Well, I did say my Berklee professors disagreed with me. But I have three points in my defense.

  1. In an orchestra, a conductor wants to get a bunch of notes played. He doesn't REALLY care who plays them (as long as they're good), but he does care about the instrument they're played on. So he points his baton at the first violinist and says "You there, play the first note of the symphony now." He is indicating the instrument, the start time, and, by handing the score to all the musicians before the concert, many other details.

I think the USERS of MIDI are most like conductors, because they can "play multiple instruments" as a conductor (kind of) can. An accomplished MIDI performer can play multiple "instruments" simultaneously, which no conventional musician can. The conductor can also "set in motion" a whole sequence of notes which, of course, an accomplished MIDI user can too.

  2. The second most significant users of MIDI are composers (conductor and composer are the two most important people in a song played at a concert, right?). (OK, let's call an arranger a special kind of composer, and the "sound engineer" another "instrument player", and maybe a few others, but you get the generalization.)

Now, from a rather deep experience in writing programs that compose: I developed a language to do this in, and the 2 most important data structures are note and phrase. A phrase is composed of notes. A song is a phrase composed of nested phrases or notes. A key flexibility is that you can have a phrase of notes that don't have an instrument associated with each note, or at least one that is not used much (perhaps a default).

Then you could declare that all the notes of this phrase are played by trombone. You could also say "All you notes in this song that usually use trombone, I now want you to play tuba (or piano)." Or: all the notes that start between beat 22 and beat 34 that originally have instrument "oboe", play on guitar. (Then, of course, "compile" those instructions into a data structure I could quickly feed to a real-time synthesizer.)

I also wanted to play melodies where each note was played by a different instrument. I didn't want to violate the melody modularity and cut it up into different parts. I just wanted to play each note on potentially a different instrument. This is most easily done by associating an instrument with a note, but does not restrict declaring one instrument at a higher level, as described above.

  3. Usually, in programming an app, you have a certain model related to the real world in mind before you code in earnest. All accomplished programmers have had the experience that their first, naive architecture made the coding and ultimate utility of the software worse than some "somewhat less natural" model that isn't really all that hard to learn and is just plain worth the initial extra effort. It may be that my definition of note doesn't fit what most musicians or even MIDI users think. But I estimate that if they don't have my model (and tools that support the model), they'll spend more time and mental effort later on getting less of what they want.

It was a general-purpose language; you get the idea. A fair amount of experience (it was a big program, written on the MIT AI Lab's special AI hardware, "the Lisp machine", that took me a few years) led me to this architecture, which I deem the most flexible yet easily understood modularity for the things most musicians would want to do, while keeping it possible (and not prohibitively hard) to do things that most musicians hadn't even thought of but that are still musically relevant.

I don't think that treating "the blue light in the corner" as an instrument, or "all the blue lights" as a family, or "all the lights" as an orchestra, or "turning on the blue light at a start time and turning it off at the end of a duration" as a note, or "a set of such 'light' notes" as a phrase, is a big stretch for most people. But if it is, well, they probably wouldn't be good lighting designers or lighting operators anyway.


cfry commented 6 years ago


Wow! I wasn't expecting that. ;-) I'm not saying timing is more/less important than instrument/timbre. In fact, you seem to assume I would add a reference to the instrument inside the Note object but I wouldn't.

I claim you already have, and it's called "channel". See my previous message.

I think our misunderstanding stems from the fact that we are not talking about the same thing. I'm talking about a communication protocol and you're talking about music notation/semantics.

You're right, but see previous message.

In fact, I'm no longer sure I should venture into the Note business inside WebMidi.js. Perhaps it would be wiser to build a music-oriented library to do that. This higher-level library could have Phrase objects playing Note objects via Instrument objects. This library would use WebMidi.js for inter-device communications in a way that would completely hide the channels, messages and devices you're not too fond of...

I think this would make more sense than to try and tack musical features onto a library

I've done that. If you like, I'll show you the doc, or the video of the webinar I did on the system.

which, in the end, is really about data communication.

This is kinda true, but see my previous message. Regardless of all of this, WebMidi.js is a very useful tool.


djipco commented 6 years ago

I also wanted to play melodies where each note was played by a different instrument. I didn't want to violate the melody modularity and cut it up into different parts. I just wanted to play each note on potentially a different instrument. This is most easily done by associating an instrument with a note, but does not restrict declaring one instrument at a higher level, as described above.

This is very interesting. I never thought of it this way. I still believe this is for another, higher-level library, but something is becoming quite clear in my mind: I need to expose a channels property on input and output devices. This property would contain an array of InputChannel or OutputChannel objects. The OutputChannel object would be very close to an "instrument" object:

var synth = WebMidi.getOutputByName("Axiom").channels[1];
synth.playNote("C3");

With this architecture, you could pass a Note object to Channel.playNote() and it would play on that channel but you could also pass a Note object to Output.playNote() and it would play it according to the channel defined in the note.
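Roughly, the dispatch could look like this (a sketch; the channels array and the Note object's channel property are both part of the proposal above, not the shipping API):

// If a Note carries a channel, Output.playNote() could defer to it:
function playNoteOnOutput(output, note) {
  var ch = note.channel || 1; // some default when the note doesn't say
  output.channels[ch].playNote(note);
}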

I need to think about this...

cfry commented 6 years ago

Sounds excellent. An "instrument" knows how to play a note and you can tell it to do so, but also a note can contain the "instrument" that you want it to be directed to when you tell the note to "play". In the case of trumpet.play(note({instrument: "sax"})), the trumpet would "overrule" the sax, as in "we're playing that sax note on the trumpet".

In my higher level software, I make a note object with, for example: new Note("C")

and that object has a play method, which I spell "start": new Note("C").start(). I originally called it "play", but my app can mix play-note instructions with play-phrase instructions, "move robot" instructions, and just about anything else you can do in JS. So I wanted to call all those "initiate instructions" by the same name; i.e. we can even effectively give a method a start time and "call" it, and it then delays until its start time (and all the other events preceding it). Hmm, now that I think about it, this sounds a little more like your conception of MIDI, i.e. not so much dedicated to music as a protocol for performing events.

When I call the start method on a note that has a 0 start time, it plays immediately; otherwise it delays to its start time. So by modifying the note structure you affect when it's played.

I think you wouldn't like this, BUT you can give a note a 0 start time and then it starts as soon as it is "played". This incorporates your-style notes as a subset of the functionality of my-style notes.
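A sketch of that start() behaviour (illustrative, not part of WebMidi; "instrument" here is any object with a playNote(note, options) method, such as the Instrument wrapper sketched earlier in this thread):

function Note(pitch, instrument, startTime, duration) {
  this.pitch = pitch;
  this.instrument = instrument;
  this.startTime = startTime; // ms from "now"; 0 means immediately
  this.duration = duration;   // ms
}

Note.prototype.start = function() {
  var self = this;
  setTimeout(function() {
    self.instrument.playNote(self.pitch, {duration: self.duration});
  }, this.startTime);
};

// new Note("C4", sax, 0, 500).start();    // plays as soon as it is "played"
// new Note("C4", sax, 1000, 500).start(); // delays one second, then plays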

Combining a "device" and "channel" into an "instrument" gets rid of the, not so useful orchestra "family" that I talked about prev message. I was primarily looking for a musical term that mapped onto midi, not something that was generally useful musically. Ok, family is useful musically but not useful enough to have one other than just sticking a bunch of instruments in an array and assigning to variable var string_family = [violin, guitar] so not a great idea to duplicate JS's functionality inside a music library here.


djipco commented 5 years ago

I'm closing this but I added a reference to it in the Enhancements section of the wiki.

cfry commented 5 years ago

"A question that quickly arises is: how do we create a Note? Do we specify a note number and it plots the name and octave or the other way around, or perhaps both are possible?

Am I overlooking anything obvious?"

First, get the terminology straight. I like using "note" for "the whole thing". I like using "pitch" for the combination of the octave and the pitch_class into one integer, i.e. 0 to 120 or so. pitch_class isn't a great name for the 0 to 11 of the chromatic scale, but it's one that others use unambiguously.

I like being able to declare, when making a note, either its pitch_class and octave OR its pitch. Providing BOTH is a good idea. pitch_class should default to 0 ("C"). octave should default to whatever you're using for the octave starting with middle C. pitch itself should default to middle C.

The underlying representation is fine to just use "pitch", but make it easy to ask a note for its octave or its pitch_class with utility methods.


cfry commented 3 years ago

Please help me understand something. In WebMidi 2.2.0, WebMidi.guessNoteNumber("C4") returns 60. In WebMidi 2.5.3, the same call returns 72. Why did you make this change? I'm aware that musicians have been confused about octaves forever. Does MIDI use 60 to mean middle C? Should middle C be designated as C4, C5, or something else? Is there consensus out there? Thanks so much!


djipco commented 3 years ago

The reasoning behind this change was explained in issue #42. If you prefer to have a different middle C, you can change the default behaviour with WebMidi.octaveOffset.

In version 3.x, you can also tweak middle C on a per-channel basis, via the InputChannel and OutputChannel property of the same name.
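For example (a sketch based on the v2.5.3 behaviour, where octaveOffset defaults to 0 and "C4" maps to 60):

WebMidi.octaveOffset = 0;       // default: middle C is written C4
WebMidi.noteNameToNumber("C4"); // => 60

WebMidi.octaveOffset = -1;      // middle C is now written C3
WebMidi.noteNameToNumber("C3"); // => 60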

cfry commented 3 years ago

Thanks for your super speedy reply. I have used WebMidi for years in my open source robotics programming environment. Here's some info I think you'll find valuable and hopefully will help others:


First, I made an error in my previous post. In one version of my software, my package.json says WebMidi 2.2.0 but package-lock.json says WebMidi 2.5.2. In my later version, my package.json says WebMidi 2.5.3 and package-lock.json says WebMidi 2.5.3. I think package-lock.json reflects the REAL version I'm using, but to verify, I tried finding a version number in WebMidi itself and couldn't. So I suggest having WebMidi.version = "2.5.3" in every release (but you have to remember to change it!). Having a "Release Checklist" saves me from forgetting such things.


Second, from https://www.npmjs.com/package/webmidi: "The API for WebMidi.js is fully documented and I take pride in maintaining good API documentation. If you spot an error ..." That paragraph has 2 actual links in it, both to https://webmidijs.org/docs/v2.5.2/index.html, which returns a 404 when browsed. (Perhaps another item for your release checklist?)


Third: On a whim I browsed https://webmidijs.org/docs/v2.5.3/index.html and that worked. The doc on WebMidi.octaveOffset is quite good once I figured out to click on the Properties tab. But the doc on WebMidi.guessNoteNumber is poor: no examples (on any of the methods on that page) and no expected return values. I suggest you add something like: Example: WebMidi.guessNoteNumber("C4") => 60, which is middle C.


Fourth: I found relevant differences between WebMidi 2.5.2 and 2.5.3:

WebMidi.guessNoteNumber("C4") => 60 in 2.5.2 (good)
WebMidi.guessNoteNumber("C4") => 72 in 2.5.3 (bad)

There are similar discrepancies in WebMidi.noteNameToNumber("C4"), which I'm guessing is the real source of the problem. Also:

WebMidi.octaveOffset => 0 in 2.5.2
WebMidi.octaveOffset => undefined in 2.5.3

But I'm running 2.5.2 in Electron, whereas 2.5.3 is straight browser code. I don't see why that would make a difference, as I haven't seen similar discrepancies elsewhere (including in WebMidi), but just giving you a heads up. I'm running on macOS 11.6, latest Chrome.

In 2.5.3, I tried WebMidi.octaveOffset = 0, then 1, then -1. For all of them, WebMidi.noteNameToNumber("C4") => 72.


Fifth: In issue #42 you wrote: "The MIDI Tuning Standard states that note number 69 should be tuned at 440Hz by default, which would also make middle C (60) to be C4. Given that, I would tend to agree to go with C4. However, this is a breaking change. What are people's views on this?" My view is you did the right thing! C4 == 60 == middle C == 261 Hz.


I appreciate the openness of this discussion, and the years of dedication you've devoted to this great project, helping musicians everywhere.

I live in Boston and attended Berklee School of Music in the 1970s. My teachers and classmates were confused by octaves back then too!


djipco commented 3 years ago

Thank you for taking the time to provide all this feedback. This is very valuable to me and the community. Here are some answers and comments of my own...

I have used WebMidi for years in my open source robotics programming environment.

I'm curious to know what environment that is.

I tried finding a "version number" in WebMidi and couldn't, so I suggest having: WebMidi.version = "2.5.3" in every release

That's an excellent idea! I will implement it in v3. By the way, you can always look at the library's source file and the version is always there at the top.

The above para has 2 actual links in it, both to https://webmidijs.org/docs/v2.5.2/index.html but when browsed, returns 404.

I fixed the links. NPM fetches the README of the project from Github and somehow caches it. I guess the fixed links will kick in at some point.

But the doc on WebMidi.guessNoteNumber() is poor.

I'm currently hard at work on v3. This version will come with a brand new website. This website will have an API section and a Documentation section. This will allow for more extensive examples in the documentation while leaving the API section clean, as a reference should probably be. Stay tuned for a major update before the holidays!

I found relevant differences between WebMidi 2.5.2 and 2.5.3...

I just did a quick test and calling WebMidi.guessNoteNumber("C4") returns the same value in v2.5.2 and 2.5.3. Something else must be at play here.

My view is you did the right thing! C4 == 60 == middle C == 261Hz

Excellent!

Again, thank you so much for the feedback. The upcoming v3 will be a huge leap forward in terms of features, documentation, community, etc. You can sign up for the email newsletter or follow the project on its brand new Twitter account to be informed of its release.

cfry commented 3 years ago

Thanks again for your fast reply. We had a little email exchange a few years ago when I first started using WebMidi. Are you in Montreal? I'm in Boston.


Thank you for taking the time to provide all this feedback. This is very valuable to me

Likewise!

and the community. Here are some answers and comments of my own...

I have used WebMidi for years in my open source robotics programming environment.

I'm curious to know what environment that is.

Dexter Development Environment: a JavaScript IDE with extensions for programming the Dexter robot (and a few other things, like MIDI!). www.hdrobotic.com is the company. It's a big Electron app. You can read the doc about the MIDI-related stuff at https://www.hdrobotic.com/software: lower right of the page, navigate the doc hierarchy Reference Manual/IO/Sound/Music with Midi. This doc, and more examples, are in the software itself at https://github.com/cfry/dde/releases/tag/v3.8.0. Let me know how interested you really are. You might wonder how robots and music are related. Err, long story, but we can have a chat sometime. There's possibly a little code you could steal (all open source) for some higher-level stuff, but in any case, I'll help.

If you download it, go to Insert menu/Music/Phrase Examples, where one of them is:

new Phrase("A2 B2")
p1
  .arpeggio([1, 3, 5], "G", [1/2, 1/4, 1/4])
  .merge(p1.transpose([1+7, 3+7, 5+7], "G") // the chords A minor, B minor
           .set_property("velocity", 0.4))  // make chords quieter to not overwhelm bass line
  .repeat(16)

Hey, I just discovered a bug in my documentation on the exact location of that menu, which I've just fixed in the upcoming version, so thanks! (It's "Insert menu/Music".)

I tried finding a "version number" in WebMidi and couldn't, so I suggest

having: WebMidi.version = "2.5.3" in every release

That's an excellent idea! I will implement it in v3. By the way, you can always look at the library's source file and the version is always there at the top.

OK, something really wacky is going on here. In my node_modules/webmidi/src/webmidi.js, the top of the file is:

(function(scope) {

"use strict";

/**

searching for "2." ONLY finds 2.0.0 on line 113

I looked in node_modules/webmidi/package.json and its top 4 lines are:

{ "name": "webmidi", "version": "2.5.3", "description": "WebMidi.js helps

The above para has 2 actual links in it, both to

https://webmidijs.org/docs/v2.5.2/index.html but when browsed, returns 404.

I fixed the links. NPM fetches the README https://github.com/djipco/webmidi of the project from Github and somehow caches it. I guess the fixed links will kick in at some point.

That was fast.

But the doc on WebMidi.guessNoteNumber() is poor.

I'm currently hard at work on v3. This version will come with a brand new website. This website will have an API section and a Documentation section. This will allow for more extensive examples in the documentation will leaving the API clean as a reference should probably be. Stay tuned for a major update before the holidays!

Sounds good.

I found relevant differences between WebMidi 2.5.2 and 2.5.3...

I just did a quick test and calling WebMidi.guessNoteNumber("C4") returns the same value in v2.5.2 and 2.5.3. Something else must be at play here.

Yes indeed. So I stepped through WebMidi.guessNoteNumber("C4") (in the version whose package.json says 2.5.3). It calls this.noteNameToNumber(input) and returns its result, which is 72. Here's the source code of noteNameToNumber, annotated with the relevant intermediate values from the stepping:

WebMidi.prototype.noteNameToNumber = function(name) {

  if (typeof name !== "string") { name = ''; }

  var matches = name.match(/([CDEFGAB])(#{0,2}|b{0,2})(-?\d+)/i);
  if (!matches) { throw new RangeError("Invalid note name."); }

  var semitones = wm._semitones[matches[1].toUpperCase()]; // => 0
  var octave = parseInt(matches[3]);                       // => 4
  var result = ((octave + 2) * 12) + semitones;            // => 72, so it looks like
                                                           //    that "+ 2" should really be "+ 1"

  if (matches[2].toLowerCase().indexOf("b") > -1) {        // not executed
    result -= matches[2].length;
  } else if (matches[2].toLowerCase().indexOf("#") > -1) { // not executed
    result += matches[2].length;
  }

  if (semitones < 0 || octave < -2 || octave > 8 ||
      result < 0 || result > 127) {                        // not executed
    throw new RangeError("Invalid note name or note outside valid range.");
  }

  return result; // => 72
};

I am using Rollup and ES6 modules for my new version, but my old version that returns 60 is using Electron and CommonJS modules. It looks like somehow I managed to get the wrong version of webmidi.js??? Since I just did a regular old npm install, I can't see how that failed, but... software can work in mysterious ways.

UPDATE: I figured it out. Rollup was sticking 2 versions of WebMidi into bundle.js. One had the correct def for noteNameToNumber, with the crucial line:

var result = ((octave + 1 - Math.floor(wm.octaveOffset)) * 12) + semitones;

The other had the incorrect line with "octave + 2", and I have no idea where it was getting that bad version of WebMidi from.

In node_modules/webmidi I had a webmidi.min.js and a webmidisomething.ts file. I deleted them both, uninstalled and reinstalled WebMidi, deleted package-lock.json, rebooted Chrome, rebooted my Mac, used the very specific import WebMidi from "../../node_modules/webmidi/src/webmidi.js", got rid of a redundant (but shouldn't-have-mattered) import of WebMidi, and eventually got the ONE RIGHT version of webmidi into my bundle.js. So now everything is working just like in my previous version of Dexter Development Environment. PHEW. Took me until 2 AM tonight.

My view is you did the right thing! C4 == 60 == middle C == 261Hz

Excellent!

Again, thank you so much for the feedback. The upcoming v3 will be a huge leap forward in terms of features, documentation, community, etc. You can sign up for the email newsletter https://mailchi.mp/eeffe50651bd/webmidijs-newsletter or follow the project on its brand new Twitter account https://twitter.com/webmidijs to be informed of its release.

Great. I just signed up for your newsletter.
