Closed: RVC69 closed this issue 3 years ago
It is intentional that rests are not highlighted during playback: they are not heard, so it is confusing to see them highlighted during playback. How is that "very annoying"?
It is really annoying if you have long silences, for example spanning several measures: during that time the cursor is not updated and stays on the last note played, so you have no way of knowing where exactly you are in the playback.
There are two issues here:
Issue (1) should definitely be fixed, and I think this is independent of the rest highlighting issue. The problem is that note highlighting is not turned off until another note is played. In the above example, there is no new note until 12 measures later, causing the first G4 to remain highlighted after it has finished sounding.
So I am changing the title of the issue and switching it to an "enhancement" issue for subissue (1) 😉
Providing timestamps for rests so that you can highlight them in your implementation using verovio is a slightly different issue. Perhaps an option could be added to verovio so that rests would be included in the output timemap. Adding rests to the current timemap would also indirectly fix subissue (1) in the current implementation, since the rest highlighting would also trigger the unhighlighting of notes that are followed by rests.
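To make that concrete, here is a rough sketch of the first few entries of such a timemap for the score above (the field names mirror the existing note entries: tstamp in milliseconds, qstamp in quarter notes, on/off lists of element IDs; the rest- entries are the hypothetical addition, and at midi.bpm="400" each quarter note lasts 150 ms):
[
   { "tstamp": 0,   "qstamp": 0, "tempo": 400, "on": ["note-L5F1"] },
   { "tstamp": 150, "qstamp": 1, "off": ["note-L5F1"], "on": ["rest-L6F1"] },
   { "tstamp": 300, "qstamp": 2, "off": ["rest-L6F1"], "on": ["rest-L7F1"] }
]
The second entry is what would indirectly fix subissue (1): turning on the rest also turns off the preceding note.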
If you want to debate subissue (2) further:
If you listen to that music without looking at the score, do you also get very annoyed? There is actually a piece written like that for that specific purpose (I can't remember who wrote it, maybe it was a student composer at my university), where there is a 45 second pause in the composition between two notes. I can only imagine your response to John Cage's 4'33"...
The example you give is otherwise a synthetic example. Is that the only instrument in the score? Why would someone waste their time listening to silence so they can watch the rests animate? If they are concerned about whether or not the webpage is broken, they could instead look at a playback button, which should be highlighted to indicate that the audio/MIDI is currently playing. I cannot remember the specifics of the original implementation, but maybe there were technological reasons why rests were not included, since the animation is probably triggered by MIDI events, and rests are not encoded in MIDI files.
However, I have implemented similar systems that highlight music notation with audio recordings, and I found it very annoying × 2 to see the rests animating like the notes. Highlighting the rests in the linked example would look quite garish and amateurish in my opinion. In particular, I do not want my eye drawn to the parts that are not playing, which highlighting rests would do.
Typically the example you give would be the part in a larger ensemble, where not all of the instruments play all of the time, and there would be other parts playing while this one is resting. In that case there should be other notes in other parts that will keep your interest.
In such a part, there is usually a single multi-bar rest:
This multi-bar rest would not be much improved by being highlighted for the duration of 12 measures.
So I am still bemused as to why you would be very annoyed at not seeing a lot of rests animate, other than that in the music editor you are familiar with (such as MuseScore), they do. But in MuseScore, neither the notes nor the rests highlight; instead there is a blue box that highlights each note. When there is a full-measure rest, the blue box progresses every beat rather than every note. Finale is somewhat similar, but more annoying because the playback cursor moves continuously and jerks around a lot, since the spatial layout is not linear between different rhythms, particularly at non-sounding events such as barlines. Sibelius is visually between the two: there is a thin vertical bar showing the current playback position. This bar jumps to the note attacks of the composite rhythm, so I like it better than Finale's continuous-motion method. Sibelius also increments the playback cursor by beat within a full-measure rest, like MuseScore.
A better reason for highlighting rests would have been something like this: for music-minus-one situations, it would be useful for the soloist to keep track of where the accompaniment is so that they can play the G4 in the above score at the correct time (although the soloist should know the music well enough to figure that out without the rests highlighting, since that is the main activity of percussionists in an orchestra). Or another possible answer would be: because I want to match the behavior of music editor X.
MEI data for score:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://relaxng.org/ns/structure/1.0"?>
<?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://purl.oclc.org/dsdl/schematron"?>
<mei xmlns="http://www.music-encoding.org/ns/mei" meiversion="4.0.0">
<meiHead>
<fileDesc>
<titleStmt>
<title />
</titleStmt>
<pubStmt />
</fileDesc>
<encodingDesc>
<appInfo>
<application isodate="2020-08-29T08:34:19" version="3.0.0-dev-3022956-dirty">
<name>Verovio</name>
<p>Transcoded from Humdrum</p>
</application>
</appInfo>
</encodingDesc>
<workList>
<work>
<title />
</work>
</workList>
</meiHead>
<music>
<body>
<mdiv xml:id="mdiv-0000001819422741">
<score xml:id="score-0000001640459867">
<scoreDef xml:id="scoredef-0000000174201910" midi.bpm="400">
<staffGrp xml:id="staffgrp-0000000782515699">
<staffDef xml:id="staffdef-0000000772538095" n="1" lines="5">
<clef xml:id="clef-L2F1" shape="G" line="2" />
<meterSig xml:id="metersig-L3F1" count="4" unit="4" />
</staffDef>
</staffGrp>
</scoreDef>
<section xml:id="section-L1F1">
<measure xml:id="measure-L1" n="1">
<staff xml:id="staff-0000000162160802" n="1">
<layer xml:id="layer-L1F1N1" n="1">
<note xml:id="note-L5F1" dur="4" oct="4" pname="g" accid.ges="n" />
<rest xml:id="rest-L6F1" dur="4" />
<rest xml:id="rest-L7F1" dur="2" />
</layer>
</staff>
</measure>
<measure xml:id="measure-L8">
<staff xml:id="staff-L8F1N1" n="1">
<layer xml:id="layer-L8F1N1" n="1">
<multiRest xml:id="multirest-0000001247089652" num="12" />
</layer>
</staff>
</measure>
<measure xml:id="measure-L32">
<staff xml:id="staff-L32F1N1" n="1">
<layer xml:id="layer-L32F1N1" n="1">
<note xml:id="note-L33F1" dur="4" oct="4" pname="g" accid.ges="n" />
<rest xml:id="rest-L34F1" dur="4" />
<rest xml:id="rest-L35F1" dur="2" />
</layer>
</staff>
</measure>
</section>
</score>
</mdiv>
</body>
</music>
</mei>
As an aside, there are a few other MIDI-playback-related bugs: (a) there needs to be added silence at the end of the score (maybe about three seconds) so that the reverb of the last note has time to die out. This might be fixed either by adding rests after the music (which does not currently solve the problem) or by tweaking WildWestMIDI to extend the sound buffer at the end of playback. And (b) when re-playing the music there is a little glitch before the start of the music. This glitch is probably the left-over reverb sound from the end of the music (so these two issues may be related), and it would be good to flush this residue out of the playback buffer before playing again (I searched for where this was happening a few years ago and could not find it).
Here is an example highlighting the problem (clipped chords are more noticeable than single notes):
MEI data:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://relaxng.org/ns/structure/1.0"?>
<?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://purl.oclc.org/dsdl/schematron"?>
<mei xmlns="http://www.music-encoding.org/ns/mei" meiversion="4.0.0">
<meiHead>
<fileDesc>
<titleStmt>
<title />
</titleStmt>
<pubStmt />
</fileDesc>
<encodingDesc>
<appInfo>
<application isodate="2020-08-29T10:21:26" version="3.0.0-dev-3022956-dirty">
<name>Verovio</name>
<p>Transcoded from Humdrum</p>
</application>
</appInfo>
</encodingDesc>
<workList>
<work>
<title />
</work>
</workList>
</meiHead>
<music>
<body>
<mdiv xml:id="mdiv-0000000265963941">
<score xml:id="score-0000001134252315">
<scoreDef xml:id="scoredef-0000000841710346">
<staffGrp xml:id="staffgrp-0000001105366982">
<staffDef xml:id="staffdef-0000000416950172" n="1" lines="5">
<clef xml:id="clef-0000000617678912" shape="G" line="2" />
<meterSig xml:id="metersig-L2F1" count="4" unit="4" />
</staffDef>
</staffGrp>
</scoreDef>
<section xml:id="section-L1F1">
<measure xml:id="measure-L1">
<staff xml:id="staff-0000001738488427" n="1">
<layer xml:id="layer-L1F1N1" n="1">
<rest xml:id="rest-L3F1" dur="8" />
<chord xml:id="chord-L4F1" dur="8">
<note xml:id="note-L4F1S1" oct="3" pname="g" accid.ges="n" />
<note xml:id="note-L4F1S2" oct="3" pname="b" accid.ges="n" />
<note xml:id="note-L4F1S3" oct="4" pname="d" accid.ges="n" />
<artic xml:id="artic-L4F1" artic="stacc" />
</chord>
<rest xml:id="rest-L5F1" dur="8" />
<chord xml:id="chord-L6F1" dur="8">
<note xml:id="note-L6F1S1" oct="3" pname="a" accid.ges="n" />
<note xml:id="note-L6F1S2" oct="4" pname="c" accid.ges="n" />
<note xml:id="note-L6F1S3" oct="4" pname="e" accid.ges="n" />
<artic xml:id="artic-L6F1" artic="stacc" />
</chord>
<rest xml:id="rest-L7F1" dur="8" />
<chord xml:id="chord-L8F1" dur="8">
<note xml:id="note-L8F1S1" oct="3" pname="b" accid.ges="n" />
<note xml:id="note-L8F1S2" oct="4" pname="d" accid.ges="n" />
<note xml:id="note-L8F1S3" oct="4" pname="f" accid.ges="n" />
<artic xml:id="artic-L8F1" artic="stacc" />
</chord>
<rest xml:id="rest-L9F1" dur="8" />
<chord xml:id="chord-L10F1" dur="8">
<note xml:id="note-L10F1S1" oct="4" pname="c" accid.ges="n" />
<note xml:id="note-L10F1S2" oct="4" pname="e" accid.ges="n" />
<note xml:id="note-L10F1S3" oct="4" pname="g" accid.ges="n" />
<artic xml:id="artic-L10F1" artic="stacc" />
</chord>
<rest xml:id="rest-L11F1" dur="1" />
</layer>
</staff>
</measure>
</section>
</score>
</mdiv>
</body>
</music>
</mei>
I think the two issues are linked.
If you turn off the highlighting on rests, you will have a static view until the next notes. If the silence is long, this is a problem. For example, California Dreaming played solo has 16 measures of silence between notes, leading to a static view for quite a while.
The problem is even worse with rests at the beginning of the score: it is impossible to know when the first note will be played.
I know MuseScore and Finale each have their own way of handling advancement through silence. I think what is expected of playback software is to show at least the correct position in the score, on both notes and rests.
I agree multirests are a problem, but multirests are mainly intended for paper scores, to limit printing costs, which is not a concern for electronic scores nowadays. The same goes for repetitions, coda, segno, and so on, which are efficient on paper scores but not on electronic scores.
Even Tempo exists in the renderToTimemap() function
Tempos are not played notes, yet they are rendered even though they are useless for cursor advancement.
I think the correct behavior would be, with an option or not, to render the rests in the timemap when converting to MIDI. Based on the element ID ("note" or "rest"), the user could then decide whether or not to highlight the rests, depending on use or habit.
I have seen in the past some MIDI files where rests are not note-offs but notes with 0 velocity, generating events for updating the advancement.
I notice that there are no hyphens in the lyrics of your music:
Is that due to the data being imported from MusicXML (in which case there could be an error in the MusicXML importer)? Or is this notation generated directly from MEI (in which case there is an encoding error in the data)? Or is this raw OMR data from a scan with lots of dust? There is a staccato on the quarter note tied to the whole note; I would find that much more annoying.
The timemap contains both note-ons and note-offs, so your example is misleading since the next entry will contain the note off. Here is a full example (HTML used to generate it is given at the bottom of this message):
Since the timemap contains both note-on and note-off events, timestamps for rests are not required to solve subissue (1), so it is not linked to subissue (2). This should be an easy fix: probably the unhighlighter only checks for an on list in the map and, when it finds one, then checks for the off list. Instead, the timemap handler should always check the off list, even if there is no on list.
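In rough JavaScript, the fix might look like this (highlightElement/unhighlightElement and the "playing" class are made-up helpers for illustration; only the control flow is the point):
// Called for each timemap entry whose tstamp has been reached during playback.
function processTimemapEntry(entry) {
   // Always clear highlights for elements that stop at this time,
   // regardless of whether anything new turns on.
   if (entry.off) {
      entry.off.forEach(id => unhighlightElement(id));
   }
   // Then highlight whatever starts sounding at this time.
   if (entry.on) {
      entry.on.forEach(id => highlightElement(id));
   }
}

// Hypothetical helpers operating on the rendered SVG:
function highlightElement(id) {
   const el = document.getElementById(id);
   if (el) el.classList.add("playing");
}

function unhighlightElement(id) {
   const el = document.getElementById(id);
   if (el) el.classList.remove("playing");
}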
However, if the MIDI player is triggering callbacks for note-ons only, this will also have to be fixed. To handle highlighting correctly, it also needs to generate callbacks for note-offs as well as note-ons. If it cannot do that, then it also would not be able to highlight rests in a non-hacky way. The best hacky way would be to insert MIDI note 0 with velocity 1 into the MIDI files (similar to what you are proposing for encoding rests in MIDI files).
It would be very reasonable to add rest on/off events to the timemap. It should be easy to distinguish rests from notes, since in standard ID labeling practice their IDs start with note- and rest- respectively (though this is not necessarily always true). If rests were added to the timemap, I would prefer that they either be added through an option (that is probably the best thing to do), or that IDs starting with rest- be automatically suppressed when highlighting in the current system.
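A minimal sketch of that ID-prefix filtering, assuming the note-/rest- labeling convention described above and the hypothetical highlightElement helper from the previous sketch:
// Skip rest entries when highlighting, based on the conventional ID prefix.
function highlightIds(ids, highlightRests) {
   ids.forEach(id => {
      if (!highlightRests && id.startsWith("rest-")) {
         return; // suppress rest highlighting unless explicitly requested
      }
      highlightElement(id);
   });
}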
Even Tempo exists in the renderToTimemap() function
The purpose of including the tempo is to allow the tstamp values to be recalculated if the tempo of the exported MIDI file changes (there are often tempo controls on MIDI players, so including the tempo setting allows the timings to be recomputed). N.B.: tstamp is the time in milliseconds from the start of the MIDI file (not MEI @tstamp values, which are more analogous to the timemap qstamp, which is the time from the beginning of the music in quarter notes).
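For example, if a MIDI player's tempo control changes the playback tempo, the millisecond timestamps can be recomputed from qstamp and the new tempo; a sketch assuming a single constant tempo in quarter notes per minute:
// Recompute millisecond timestamps for a new tempo (quarter notes per minute).
// Assumes one constant tempo; with tempo changes, each segment would need to
// be rescaled from its own tempo entry.
function rescaleTimemap(timemap, newQpm) {
   return timemap.map(entry => ({
      ...entry,
      tstamp: Math.round(entry.qstamp * 60000 / newQpm)
   }));
}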
The problem is even worse with rests at the beginning of the score: it is impossible to know when the first note will be played.
The question is why do you care to know when the first note (after the long rests) will be played? If you are singing along with other music, presumably that music would be in the score as well, and you can look at that music while waiting to sing.
I agree multirests are a problem, but multirests are mainly intended for paper scores, to limit printing costs, which is not a concern for electronic scores nowadays. The same goes for repetitions, coda, segno, and so on, which are efficient on paper scores but not on electronic scores.
That is an interesting viewpoint, but it would be a very minority viewpoint. Professional orchestral players would complain greatly if they had to look at a long, endless stream of rests in their part: of course on paper, but there would also be a lot of wasted real estate on digital screens. Space is still limited by screen sizes, which are smaller than standard music part sizes, so while the verbosity of notation is increasing, it will never go as far as you are implying. In addition, repeats have a functional aspect: performances can be shortened by "taking the second ending". This functionality cannot be totally replaced by digital scores, but it could be handled by adding/removing measures for the repeats in an expanded score, which is not a common feature of scores yet. If it were, there would be three modes: standard score view, full-repeats view, and no-repeats view.
I have seen in the past some MIDI files where rests are not note-offs but notes with 0 velocity, generating events for updating the advancement.
There are two methods of note-offs in MIDI files: one is to use the note-off message commands (0x80), and the other is to set the note velocity to 0 on note-ons (0x90 command set). Do you mean the second method, or do you mean that another dummy note-off is added to the MIDI file which does not pair with a note-on? That seems like it would work in most cases, although for two instruments sharing the same channel there would be bugs where a fake note-off for one note could prematurely turn off a note in the other part.
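For reference, the two encodings as raw MIDI status/data bytes for the first channel and key 60 (illustrative values only):
// Two equivalent ways to end a sounding note (first channel, key 60):
const explicitNoteOff    = [0x80, 60, 64]; // note-off command, release velocity 64
const noteOnZeroVelocity = [0x90, 60, 0];  // note-on with velocity 0, interpreted as note-off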
I agree multirests are a problem, but multirests are mainly dedicated to paper score to limit printing costs which is not the case on electronic scores nowaday. This is the same for repetitions, coda, segno... which are efficient on paper scores not on electronic scores.
There is currently a rendering bug with multi-bar rests in the MIDI converter of verovio: they are given zero time. Although this is a bug, it is a handy feature, since you do not have to wait through all of those rests to continue hearing the notes...
HTML for timemap example:
<html>
<head>
<title>timemap test</title>
</head>
<body style="background:skyblue;">
<div id="example"></div>
<div style="background:white; white-space:pre; padding:10px; width:200px;"
id="timemap"></div>
<script>
var vrvToolkit;
var Module = {
onRuntimeInitialized: function() {
vrvToolkit = new verovio.toolkit();
prepareExample("example");
prepareTimemap("timemap");
}
};
</script>
<script src="https://verovio-script.humdrum.org/scripts/verovio-toolkit-wasm.js"></script>
<script>
function prepareExample(targetid) {
var delement = document.querySelector("#humdrum-data");
var data = delement.textContent;
var options = {
scale: 60,
adjustPageHeight: 1,
pageWidth: 1200,
breaks: "encoded"
}
var svg = vrvToolkit.renderData(data, options);
var element = document.querySelector("#" + targetid);
element.innerHTML = svg;
}
function prepareTimemap(targetid) {
var timemap = vrvToolkit.renderToTimemap();
var text = JSON.stringify(timemap, null, 3);
var element = document.querySelector("#" + targetid);
element.innerHTML = text;
}
</script>
<script type="text/x-humdrum" id="humdrum-data">**kern
*M4/4
4g
4r
2r
=
1r
=
4r
2r
4g
==
*-
</script>
</body>
</html>
This score was imported from XML and I haven't paid attention to the formatting.
I see two main uses of your wonderful software:
I am more interested in the playback. As a non-professional, what I find most difficult is keeping tempo through long silences; that is my interest regarding rests.
To your question, "The question is why do you care to know when the first note (after the long rests) will be played?": I play wind instruments, and when I start playback by clicking the button, I need at least one measure to be ready to play in sync with the player. So I add one or two rest measures at the beginning of scores. I find advancement through these measures important so I can catch the right tempo without a metronome.
I usually play with a touch-sensitive tablet on my stand. But the screen is small, so I keep the full tracks in the MIDI file but only my part of the score for viewing. In that case I can hear the advancement of the other instruments, but I cannot see the advancement in my part during rests. (I do receive the full time events from the MIDI player.)
From a programming point of view, having a full timemap would be helpful to navigate between elements: notes, rests, and their parent measures. As an example, you could decide to display the currently played measure number, which today is only possible when you are on notes, not on rests (see the sketch below).
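As a sketch of the kind of thing this would enable, here is one way the current measure number could be looked up from the ID of the currently sounding element, assuming Verovio's SVG wraps each measure in a <g class="measure"> group (the data-n attribute used here is an assumption and may not match actual output):
// Given the id of the element currently sounding (note or, with a full
// timemap, a rest), find its enclosing measure in the rendered SVG.
function currentMeasureNumber(elementId) {
   const el = document.getElementById(elementId);
   const measure = el ? el.closest("g.measure") : null;
   return measure ? measure.getAttribute("data-n") : null; // hypothetical attribute
}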
I haven't tested yet, but without a full timemap, navigating in the score by clicking on an element can only be done on notes, not on rests. That means you could not start playing at the beginning of a measure that begins with rests, only at the first note. Even if you consider this intentional, I find not being able to start at the beginning of a measure a strange behaviour.
Regarding multirests without MIDI events, maybe an easy solution would be to blink the rest element with a JS setInterval at the beat or measure tempo, as sketched below.
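A sketch of that idea (the "playing" class and element lookup are assumptions):
// Toggle a highlight class on a (multi-)rest element once per beat while it
// lasts, so the rest pulses during playback instead of staying static.
function blinkRest(restId, qpm, beats) {
   const el = document.getElementById(restId);
   if (!el) return;
   const beatMs = 60000 / qpm; // milliseconds per quarter note
   let remaining = beats;
   const timer = setInterval(() => {
      el.classList.toggle("playing"); // assumed highlight class
      if (--remaining <= 0) {
         clearInterval(timer);
         el.classList.remove("playing");
      }
   }, beatMs);
}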
In conclusion, a full timemap with rests, as an option, would be wonderful and would offer a lot of possibilities.
Remains unclear. Closing for now
Ran into this when trying to overlay a Verovio-rendered SVG with a graphical pointer. I think this should be solved, as pauses are as vital as notes for keeping orientation in the score.
An options parameter, something like renderToTimemap(options: {includePauses: true}), could keep the current behaviour if null, but still make it possible to use the pause information when needed.
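Usage could then look something like this (includePauses is the option proposed above, not an existing Verovio option; highlightElement/unhighlightElement are hypothetical helpers as in the earlier sketches):
// Proposed usage; falls back to current behaviour when no options are given.
const timemap = vrvToolkit.renderToTimemap({ includePauses: true });
timemap.forEach(entry => {
   (entry.off || []).forEach(id => unhighlightElement(id));
   (entry.on || []).forEach(id => highlightElement(id)); // now also includes rest- ids
});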
If anyone's interested, I've created a fork and added timeMap info for rests as well as notes: https://github.com/cambiata/verovio/tree/extended-timemap
Great! Are you willing to make a PR? There are a few things to adjust. I can make a review in the PR.
You will need to sign the CLA unless you have contributed and done it before. Thanks
renderToTimemap retrieves notes and tempo, but unfortunately not the rests, which are just silent notes.
Since these values are missing from the timemap, getElementsAtTime() will not retrieve the rests, so they cannot be highlighted during playback.
This is very annoying for long silences, as notes stay highlighted until the next note.
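For illustration, a typical highlighting loop built on getElementsAtTime(), which returns the elements sounding at a given time in milliseconds (the notes field and the player API used here are approximations):
// Poll the player's position and highlight whatever the toolkit reports as
// sounding at that time. Because rests are absent, during a long rest the
// toolkit keeps returning the previously sounding note, so the old highlight
// never goes away.
setInterval(() => {
   const timeMs = midiPlayer.currentTimeMs();              // hypothetical player API
   const elements = vrvToolkit.getElementsAtTime(timeMs);  // e.g. { notes: [...], page: ... }
   document.querySelectorAll(".playing")
      .forEach(el => el.classList.remove("playing"));
   (elements.notes || []).forEach(id => {
      const el = document.getElementById(id);
      if (el) el.classList.add("playing");
   });
}, 50);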
Thanks for the correction.