You're right, Tempo is largely useless, other than to display in the animation... or to set a lamp intensity, etc. (the lamp gets brighter as Tempo increases).
I don't mind the order; it depends on what the foremost filter will be. If we filter by channel (Track) first, then set this as the first field, maybe?
I also used the Time Signature to show bars in an animation and "downbeat", so outputting this might be a good idea. Maybe add/compute a pulse every bar from the Time Sig... Thinking out loud again.
Fixed field number is not important, as we are mainly going to filter by the first two fields only: Channel= and Type=.
OK for tempo, it can be kept. I just did not see its possible utility.
We can also save the track information for each track, so they can be auto-labelled directly from the MIDI file.
MIDI file order is Channel (Track) first, so maybe that is more intuitive for programmers/users.
I rewound just a few messages, and now I'd say: no, the channels can in fact be handled by having separate channel objects. A list of "channel/track" objects, where each object contains some information about the "channel/track": its title, and obviously all the MIDI messages for notes, etc.
Do you mean a separate node for each channel? Sorry, I didn't understand the last message.
Is the socket output of the MIDI File node one track chosen by the user? Is the default 0 or 1? If the user wants to manage another track, must they use a loop? And is the other output the number of channels?
Or is it all the tracks in a structured list of tracks?
Everything is possible. For my quick POC I used the second choice. Omar's seems to be less confusing. Waiting to discuss all of this together.
Is the socket output of the MIDI File node one track chosen by the user? Is the default 0 or 1?
I believe the first music track with notes in it is channel/track 2, channel/track 1 is just the info, like initial tempo, time-sig, etc.
Oh, I think I numbered differently, starting from 0.
Example:
I think that is the pulse rate in channel/track 0... I may have to look that up.
Yes it is, here is my code to get the pulse value from a csv MIDI file:
```python
with open(path) as f1:
    for line in f1:
        in_l = [elt.strip() for elt in line.split(',')]
        if in_l[2] == 'Header':
            # Get the Pulse variable (ticks per quarter note).
            pulse = int(in_l[5])
            dataD['Pulse'] = pulse
```
This gives you bpm:

```python
bpm = float(round(60000000 / int(tempo), 3))
```
This gave me frames from MIDI timings; you can adapt it to give you seconds:

```python
frame = round(int(in_l[1]) * (60 * fps) / (bpm * pulse), 2)
```

`int(in_l[1])` was the timing (in ticks) from the MIDI line, and `fps` was the blend file's frames per second.
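The formulas above can be combined into small helpers. This is only a sketch, assuming `pulse` is the Header division (ticks per quarter note) and `tempo` is in microseconds per quarter note, as in the csv files shown here:

```python
def tempo_to_bpm(tempo):
    # MIDI tempo is expressed in microseconds per quarter note.
    return float(round(60000000 / int(tempo), 3))

def ticks_to_seconds(ticks, bpm, pulse):
    # pulse = ticks per quarter note (the Header "division" field).
    return ticks * 60.0 / (bpm * pulse)

def ticks_to_frame(ticks, bpm, pulse, fps):
    # Same conversion expressed in frames for a given scene fps.
    return round(ticks_to_seconds(ticks, bpm, pulse) * fps, 2)
```

So with the default tempo of 500000 µs (120 bpm) and a pulse of 480, a note at tick 480 lands half a second, or 12 frames at 24 fps, into the animation.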
Just another thought: have you considered the fact that some DAWs/keyboards use middle C as C3 and some as C4? I remember I had to account for that and make it an option on my nodes; the note is just a number, not a name, in the MIDI file.
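That middle-C option could be sketched like this (a hypothetical helper, not taken from the actual nodes; MIDI note 60 is middle C, which different DAWs label C3 or C4):

```python
NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def note_name(number, middle_c_octave=4):
    # MIDI note 60 is middle C; some DAWs call it C3, others C4,
    # so the octave convention is exposed as an option.
    octave = number // 12 - (5 - middle_c_octave)
    return NOTE_NAMES[number % 12] + str(octave)
```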
Thx Clock, I did that before for my other project. And I just committed changes to manage the first node.
With the use of Channel:
And the use of Track instead:
@Clockmender The Separation node you are describing can be done, even manually using a loop with reassignment. So I wouldn't worry about this case.
@Patochun Correct me if I am wrong, but isn't the channel a property of a MIDI Note? Why would we do the segmentation you are describing? The node should always return a list of MIDI Tracks, the MIDI notes will store the channel as an integer ID. A node will be provided to filter notes based on channel ID to drive different animations analogous to the different instruments that would have been played on those channels.
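The filter-node idea could look like this as a minimal sketch (representing a MIDI note as a dict with a `"channel"` key is purely an assumption for illustration; the real node would operate on note structures):

```python
# Hypothetical core logic of a "filter notes by channel" node.
def filter_notes_by_channel(notes, channel):
    # Keep only the notes whose stored channel ID matches.
    return [note for note in notes if note["channel"] == channel]

notes = [
    {"channel": 0, "number": 60},
    {"channel": 9, "number": 36},  # e.g. drums on channel 10 (ID 9)
    {"channel": 0, "number": 64},
]
```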
Yes, it is a property of a MIDI note, and that will be the case in the vast majority of files. But some software does not use this property to separate instruments when exporting: it leaves the channel at the same value everywhere and uses the tracks to differentiate the instruments. This is the case for Propellerhead's Reason MIDI export (Propellerhead is now Reason Studios). I know this issue because I work with this software. This is not to impose another segmentation, but to propose it in case the MIDI file is built this way. We could keep this boolean hidden at the start and, when the MIDI file is chosen, detect the N-tracks-for-1-channel configuration and then display the boolean?
The MIDI file will be of Type 1, with one MIDI track for each track in the Reason sequencer. Since the Reason sequencer doesn’t use MIDI channels as such, all tracks will be set to MIDI channel 1.
As a fellow Reason user, I can confirm segregation by Track is needed, Reason does not place much store in MIDI Channel...
A human-readable version for you to look at, to see how it is set out, if that helps at all. This was exported from Reason.
If Reason exports them as tracks, then why should we add this option? Are you trying to simulate Reason's method of doing things?
@OmarEmaraDev I don't understand your last comment; it's simply that we need to be able to segregate by Tracks (as Patrick has done) as well as by MIDI channels... I believe there is a maximum of 16 MIDI channels, whereas Tracks are unlimited. I don't think Reason is the only DAW that does this either...
@Clockmender Segregation by channel is fine and we should provide an intuitive way to do it. However, I don't think it should be an option of that node. We can provide specialized nodes for that, like a Filter node.
I don't mind how it is done, perhaps you could agree that with Patrick, I don't want to get involved in any coding myself.
@OmarEmaraDev, I think, like @Clockmender, that Reason is not the only DAW to export MIDI files coded this way => Track versus channels. It is simply impossible to separate the various instruments if the channel information is not used. On the other hand, as I show in the attached documentation, the tracks are used instead in this case. We could walk away from it at first, although it is not a problem in terms of code (already coded, in fact), but keep in mind that it will probably be requested soon enough.
I am the coder/tester type and I like to make things concrete as I go. So here, quickly, is the state of what I have prepared:
Nothing definite, of course. Does it start to look like what we discussed together?
@Patochun I like that! So, 15 tracks, you can separate by instrument and note, and it's Dave Brubeck's "Take 5". I also did that in Reason some time ago; loved playing the piano bit on my TMK88.
It is simply impossible to separate the various instruments if the channel information is not used.
The channel will always be stored in MIDI Notes, so they can always be separated. What I fail to understand is why you would join all of them into channels. There is probably a very good reason why notes are in a certain track. Moreover, it seems counterintuitive to return "channels" from the node, where "channels" are not structures but merely properties. The node output is even named "Tracks".
@Patochun This doesn't look very similar to what we discussed, but I don't want to rush and judge something that isn't definite. So continue for now until you get something that is more rigorous.
Just enlighten me on the lines of work. The goal is to converge slowly (eventually) but surely to the solution.
Unless I am greatly mistaken (possible), the first number is the one you want to segregate by:
```
2, 0, Start_track
2, 0, Title_t, "Piano Ch1"
2, 0, Instrument_name_t, "Piano Ch1"
2, 0, MIDI_port, 2
2, 1000, Note_on_c, 0, 74, 80
2, 1000, Note_on_c, 0, 86, 80
2, 7680, Note_off_c, 0, 74, 0
2, 7680, Note_off_c, 0, 86, 0
2, 7680, Note_on_c, 0, 76, 79
2, 7680, Note_on_c, 0, 88, 79
2, 15360, Note_off_c, 0, 76, 0
2, 15360, Note_off_c, 0, 88, 0
```
2 in this case is the first piano track.
Another example taken from the internet, not produced with Reason:
```
1, 8711999, Tempo, 1333333
1, 8712000, Tempo, 1500000
1, 8742719, Tempo, 1500000
1, 8742720, Tempo, 1714285
1, 8773439, Tempo, 1714285
1, 8773440, Tempo, 2000000
1, 8788799, Tempo, 2000000
1, 8788800, Tempo, 1333333
1, 8789760, Time_signature, 6, 2, 24, 8
1, 8881920, End_track
2, 0, Start_track
2, 0, Title_t, "Manual_Ch1"
2, 0, Instrument_name_t, "Manual_Ch1"
2, 0, MIDI_port, 10
2, 0, Control_c, 0, 64, 0
2, 3904320, Note_on_c, 0, 65, 64
2, 3904320, Note_on_c, 0, 81, 64
2, 3907980, Note_off_c, 0, 81, 0
2, 3908160, Note_on_c, 0, 77, 64
2, 3911640, Note_off_c, 0, 65, 0
2, 3911820, Note_off_c, 0, 77, 0
2, 3912000, Note_on_c, 0, 76, 64
2, 3915660, Note_off_c, 0, 76, 0
2, 3915840, Note_on_c, 0, 77, 64
2, 3919500, Note_off_c, 0, 77, 0
2, 3919680, Note_on_c, 0, 72, 64
2, 3923520, Note_on_c, 0, 77, 64
```
Exactly the same for the Manual (organ). Incidentally, there are 186 tempo changes in this file, something to note for your development work. I segregated by this first number in all my work, for what that is worth.
The figure after the Note_on_c is always 0 in both files, BTW. I think I will excuse myself from this conversation for now...
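Those 186 tempo changes mean a single bpm value will not convert ticks to seconds correctly; each tempo span has to be accumulated. A hedged sketch of walking such a tempo map (function and argument names are assumptions, not from the actual code):

```python
def tick_to_seconds(tick, tempo_map, pulse):
    # tempo_map: list of (start_tick, tempo_in_us_per_quarter) pairs,
    # sorted by tick and starting at tick 0.
    # pulse: ticks per quarter note (the Header "division" field).
    seconds = 0.0
    for i, (start, tempo) in enumerate(tempo_map):
        if tick <= start:
            break
        # This tempo applies until the next change, or until `tick`.
        end = tempo_map[i + 1][0] if i + 1 < len(tempo_map) else tick
        span = min(tick, end) - start
        seconds += span * tempo / (pulse * 1000000)
    return seconds
```

For example, with pulse 480 and a tempo change from 500000 µs to 1000000 µs at tick 480, tick 960 is 0.5 s for the first quarter note plus 1.0 s for the second.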
Also, please note that we are making this system for users who have probably never heard about MIDI before and have zero knowledge of what any of its details mean. So just because you or Alan understand what is going on doesn't mean your average user will.
Hello all. OK @OmarEmaraDev, I have full time today and tomorrow, so we can build the foundations. First we need to understand each other much better. I'm doing my best, I think; I'll try to have a vision less biased by my MIDI experience. Writing a little about the specifications is always a good working basis.
I was expecting the MIDI File Parser node to return a list of MIDI Tracks. Instead, I see it returning a dictionary of an implicit "Channel" structure, which is conceptually not a structure.
Node: MIDI File Parser
Prop: none => Note 1
Input: Path of the file
Output: List of MIDI Tracks => Question 1
I was expecting the MIDI Track Info node to take a MIDI Track and return a list of MIDI Notes. Instead, it is taking a dictionary of the implicit channel structure and returning an implicit "Messages" list.
Node: MIDI Track Info
Prop: Track index => Question 2
Input: List of MIDI Tracks
Output: List of MIDI Notes => Question 3
I am not sure what the MIDI Note node is doing here.
Me neither, in the context described above. So forget it!
Then you are returning events, which I thought was something we agreed we were not going to do.
In my opinion, this relates to our mutual misunderstanding of what a note versus a played note is, addressed at length in Question 3. Essentially, a MIDI file is an event file. I think we have been stuck on a question of semantics. It is no problem to clarify this point further if needed.
Also, please note that we are making this system for users who have probably never heard about MIDI before and have zero knowledge of what any of its details mean. So just because you or Alan understand what is going on doesn't mean your average user will.
Apart from the fact that I think people who have never heard of MIDI will be reluctant to go for animations based on it, I fully agree with making this experience the simplest and most efficient possible. Of course. I put my experience at the service of others; that is also the meaning of free software, and I am attached to it.
I do not know how much time you have for this project. But I hope we can move forward strongly these next two days.
====================================================
Note 1 = OK. If we have separate tracks but undifferentiated channel numbers on the note_on and note_off MIDI messages, discrimination is done by the separate tracks. It's just that if, later in the process, we aggregate by channel number, we will then mix notes for the piano with notes for the drums, etc. That is the case for MIDI files created by Reason at least; I have not checked all DAWs, but it seems to be fairly common.
Question 1 = Underneath, it forms the list of tracks. A list of track objects (which is what I thought I did): a list containing exactly what, and in what form? (Python-speaking, or AN-speaking.) Do not forget that I only have a little more than a year of experience with Python (even if I have been a computer scientist for 28 years) and not even a month with AN. Do not hesitate to be more precise and directive; it will help me.
Question 2 = The track index is an integer. It would be nice to display the name of the track corresponding to the chosen index, to help the user. What do you think?
Question 3 = Same question as for the list of MIDI Tracks: what's in it, and in what form? To be more precise, what is a note in this context? I imagine we are talking about a played note, which is very similar to a MIDI event, noted in the MIDI file via the concept of a MIDI message.
In the first structuring design given by you:
Midi File:
    Midi Track 1, Midi Track 2, ...
Midi Track:
    Midi Note 1, Midi Note 2, ...
Midi Note:
    Note Channel, Note Number, Note Off Time, Note Off Velocity, Note On Time, Note On Velocity
What is called MIDI Note 1 or 2 is conceptually not a "note" object but a note object that describes the event of a note being played. Which comes down to:
A note "Note Number" played on "Note Channel" at the moment "Note On Time" with the velocity "Note On Velocity", which stopped playing at the moment "Note Off Time". (We can discuss keeping the velocity value for note_off, which must always be 0 and strongly resembles a constant.)
Do we agree, now, that what you call a Note is actually a note played, so an event in time?
A MIDI Note is a structure that encapsulates two consecutive MIDI events on the same channel and note number: a Note On and a Note Off event. Conceptually, it is a description of a sound that was played.
I didn't know the Note Off velocity was always zero; in that case, we can omit it as you suggested. Your description of the MIDI Note seems to align with mine, so I think we are in agreement.
I guess I am fine with calling it an event in time as long as we make it clear that it is not a trigger event, but rather, a gate event.
I don't know if this helps, but any DAW will export the same number of tracks as there are in the DAW's sequencer; it could be 1 or any number. It is not fixed: the first one I posted has only 1, my largest has 23, but this should not be seen as a maximum.
@Clockmender I realize that, but my question is, will Reason export a MIDI file with any number of tracks? From what I understood from you, it will put all events of the same channel into a track. But what if there were multiple tracks? Will tracks 0-15 be the channels of the first track, tracks 16-31 the channels of the second track, and so on?
It will export any number of tracks; each record (line in the MIDI file with note-on, etc.) starts with the track number. The MIDI channel is irrelevant and only occurs in the track header. So all notes in one track will start with the track number, not the channel number. Is that a better description?
Many DAWs refer to this as the MIDI port, not the MIDI channel; if you look at the last bit of MIDI I posted you can see this: the port was 10, the track was 2.
Perversely, many tracks could share the same MIDI port, as can be seen from the discrepancy in Patrick's two lists.
Also perverse is the fact that all tempo changes are in track 1, the file header, not chronologically in the MIDI messages, as these are only chronological per track, and the tracks are then in index order.
My head hurts :grinning:, I will have to do some more research on the topic.
Thank you for your comments and approval. I will work this way. I have already started, as usual :)
For Reason, the MIDI file type is in fact 1, and it uses one track per instrument, but leaves all MIDI channels at 0 everywhere.
To be more precise, I don't have a MIDI file with more than 15 tracks, so I don't know how Reason deals with more. I can run some tests.
After coding this, I don't think the MIDI Track Info node (Prop: Track index; Input: List of MIDI Tracks; Output: List of MIDI Notes) is useful anymore, because AN's Get List Element node does the job. What do you think?
I also put the times in this order: note_on first, then note_off.
Will we be able to go to the next step?
At a given moment it will be necessary to convert the time into seconds, and then into a frame via the current fps.
@Patochun We do not allow lists of lists in Animation Nodes. This only works right now because you are using Generic lists. So, no, I think the MIDI Track Info node is essential.
Not entirely sure what you mean, but the times should be fps agnostic.
@OmarEmaraDev We agree that the output of "MIDI File Parser" is a flat Python list that contains objects of type MIDI Track... but I did not understand that this object could not itself be a list. So what should it be?
About the times being fps agnostic: I do not have your view of what comes next and how all this data will be treated to make an animation. I would need to know more; it could avoid taking wrong roads.
The MIDI Track object should be an instance of a `MIDITrack` class that contains a list of MIDI Notes. See the `Sound` class to see how this looks in practice. Similarly, a MIDI Note will be an instance of a `MIDINote` class that contains the info we discussed. See the `SoundSequence` class to see how this looks in practice.
We can't use lists because Animation Nodes rely mostly on typed lists, that is, all elements of the same list should have the same type, which is not the case here.
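As a rough illustration of the shape being described, here is a sketch of such classes. The fields follow the thread's earlier discussion (channel, note number, on/off times, normalized velocity); the real `MIDITrack`/`MIDINote` classes in Animation Nodes may differ:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MIDINote:
    channel: int       # channel ID stored per note (0-15)
    number: int        # MIDI note number, 0-127
    timeOn: float      # seconds
    timeOff: float     # seconds
    velocity: float    # normalized to 0.0-1.0

@dataclass
class MIDITrack:
    name: str
    notes: List[MIDINote] = field(default_factory=list)
```

This way the parser can return a plain list of `MIDITrack` instances, and each track carries its own typed contents without nesting lists inside lists.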
What kind of animation do you have in mind?
OK, I understand the lists better now; I discovered the constraint that all elements must have the same type. And so, every time that is not the case, we need to use an object. Am I right?
I will go study the Sound nodes more deeply.
For the first test animation, I was thinking of something very simple: a bar graph, where each bar represents a note. A group of bars represents a track. And these bars evolve over the course of the music.
And so, every time that is not the case, we need to use an object. Am I right?
Not sure I understand what you mean. Can you elaborate?
In the AN "world" I cannot use a list containing elements of different types, as you said. The other sentence is just a consequence of that; nothing more.
As an animator, I would just like to see a node that outputs, via a float socket, a list of floats, one for each note played, at every frame, with a second socket telling me what notes those floats are, i.e. the first is e3, the second is g2s, etc. A main node lets me select a MIDI file and outputs a set of tracks; for the first node I mentioned, I'd access these via a standard Get Element node. How you get there; je m'en fous!
EDIT:
I have some routines that translate MIDI note numbers into actual note names, for both conventions of where middle C is, and also some routines to translate notes into frequencies; if these are of any use to you, just let me know.
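The note-to-frequency conversion is standard equal temperament. A sketch, assuming A4 = 440 Hz tuning (the tuning reference is an assumption, hence the parameter):

```python
def note_frequency(number, a4=440.0):
    # MIDI note 69 is A4; each semitone multiplies frequency by 2**(1/12).
    return a4 * 2 ** ((number - 69) / 12)
```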
I committed this:
@Patochun Looks nice. Now try to create sockets for those data. Use the existing socket classes as a reference.
Yes, it is the next step; I'm on it. A question: the output of MIDI File Parser is a list of objects, and the input of MIDI Track Info is a single track object. So we cannot link these two nodes without Get List Element. Am I right?
OK, I see: with each socket comes a list socket.
Yes, we can't link them directly.
With sockets:
I confess I did not really understand how sockets worked. I took a model and adapted it, but did not really understand it.
@OmarEmaraDev Maybe we should already do a code review, so as not to diverge from the standards, and correct what needs correcting as soon as possible.
I have a bug in the two nodes MIDI Track Info and MIDI Note Info: they do not work when nothing is connected to the input. I cannot understand why for now.
How do you see the next steps?
@Patochun Nice progress! I think we should postpone code review for now until we get some practical examples first.
Regarding the MIDI Note Info node: why are the Time On and Time Off outputs of type Integer, shouldn't they be floats? Also, since we know the maximum value of the Velocity output, can we normalize it and make it a float?
Next, I want you to create two nodes:
Yes, Float. Well spotted.
OK for the next nodes. I'm on my way.
Hello, I started writing some code to use MIDI events to drive animation in Blender.
Like this : https://www.youtube.com/watch?v=9N9v5mHrADw
Now, I think it's time to release my work for everyone. I thought at first of creating a Blender add-on, but Animation Nodes seems to be the best choice.
I'm working on a proof of concept for now: animating a list of integers, where each represents a note and its velocity value. So in this POC I limit myself to only one MIDI track.
I have a few questions to start:
Animation Nodes is a big, beautiful piece of code. I'm a Python dev of level 4, while AN requires dev level 15... it's a journey for me.