Closed asigalov61 closed 4 years ago
Update: I was able to solve the issue myself, at least by enumerating offsets and by applying chordify, so I am closing this thread.
But please know that your documentation is subpar and unhelpful for newcomers, which I find very strange and frankly disappointing, because you are MIT and, of all people, you should know that good documentation is very important.
Thank you for hearing me out and thank you @jacobtylerwalls for your indispensable help.
Hi Aleksandr, thanks for the kind word. I'm not at MIT, I'm just Some Person, so I hope you keep enjoying music21 and some day start contributing to it, too. Were you using the User's Guide to research your questions? Or the module documentation and doc strings? Were you using the search box? I'm just curious how you were working through the material so I can understand how to propose an improvement.
The User's Guide is addressing an audience so wide it includes people who have never touched Python before, which I think is awesome, but also presents challenges if you're a vicious skimmer, like me. What do you propose, a Quick Start Guide that gives breezy examples for most use cases and assumes knowledge of Python? The use case of parsing attributes like pitch and duration and offset from a stream and writing to disk using a custom text scheme and then reading those elements back from disk after a vector transform and creating a new stream from scratch--so far, that hasn't been a typical use for music21, but if we hear about lots of folks doing it, maybe part of a Quick Start Guide could demo a couple approaches, one like yours, and another writing information to .csv as someone wrote about on the mailing list recently. Ultimately this is what the mailing list is for, hear from folks trying similar things.
You appear to want to create a monophonic stream, i.e. a melody or a series of chords in sequence. So you need to tell music21 to put the notes and chords in order in time, so that it doesn't try to build overlapping voices or parts, which it can also do. I think really you just want this:
for noteOrChord in yourStuff:
    yourStream.append(noteOrChord)
.append() is discussed in ch. 4, which is pretty early--have to address Note objects beforehand after all--and if you're hitting the search bar first, I find most things in music21 are called what you expect, so you can also hit the module documentation by searching on the term append in the search box or in your code editor, because that's how you would create an ordered list of things in Python anyway.
Anything more specific to suggest besides saying "the examples are subpar/old-style"? Honestly asking.
Yes, thank you for the detailed response, Jacob. I need to rest after my hackathon, but I promise that I will answer your questions and also look into your code suggestions.
Alex
Hey Jacob,
Responding as promised...
First of all, if you are not at MIT, you should be, because MIT could learn from you how to be welcoming and understanding toward newcomers and people who want to learn and help. 🙂
To answer your questions:
I mostly used Google, StackOverflow, and your full manual, plus everything else here and there. I learn when I need to, and I do not really have a specific approach here; anything goes when it comes to learning, IMHO.
I had to rely on the manual, of course, because Google does not know much about this kind of software, for obvious reasons and due to the very same problems with the docs.
I would highly suggest creating, at the very least, a mini-manual for newcomers, because it will help prevent unneeded questions and misunderstandings.
Here is a good example I can highly recommend. The authors did a great job with it, and it is what I would consider a good, solid user manual: https://salu133445.github.io/muspy
Another great example is StackOverflow and its community features, because they really nailed it in terms of explaining code and providing clear and concise examples.
I also made several other suggestions to Mr. Cuthbert, so you two should work together and help fix this rather disappointing drawback of music21.
I also highly suggest doing this ASAP, because it is not only much needed but may become a serious necessity: the Music AI field has matured and is about to explode, IMHO. Music21 would be a tremendous asset once that happens. You have gold in your hands, so don't throw it away.
Thank you for the info about the mailing list and user groups. I simply was not aware of it and did not have time to check it out, so I will as soon as I have free time. This is precisely why I had to disturb you: I am incredibly busy and I highly value my time, so I need quick and clear answers, not suggestions to go somewhere else. I hope you understand my situation.
I actually would be more than happy to donate my Colab notebook in some form so that you can turn it into a nice example in the docs, and I can create mini-Colab code examples for you if you wish. In fact, a nice set of Google Colab/Jupyter/Kaggle notebooks would make an even better intro guide. It is always best to teach by example rather than with what you have currently.
In regard to what I am doing and trying to do, I will respond later, if you do not mind, as I do not have more free time at the moment. But I will respond, as I did now.
Sincerely,
Alex
Hey guys,
Another noob question here for you:
I am having difficulty creating a proper stream again.
I pass durations to my notes and chords, but I do not understand whether I have to enumerate offsets or durations, and my stream has all elements at one offset/time point. Here is the debug output:
As you can see, the offsets are all 0.0.
What am I doing wrong, and how do I fix it?
Thank you so much.
{0.0}
{0.0} <music21.chord.Chord A4 F#4 D4>
{0.0}
{0.0}
{0.0} <music21.chord.Chord D4 F#4 A4>
{0.0}
{0.0}
{0.0} <music21.chord.Chord F#4 G#3 A4>
{0.0}
{0.0} <music21.chord.Chord F#4 A2>
{0.0}
{0.0} <music21.chord.Chord D4 F#4 A4>