That site is awesome! Great job on it!
I tried out the full-synth.html example in the project and it returns `currentTrackMilliseconds` as expected.
That value is not generated all the time, though. It can't be generated until the playback options are specified, because things like the bpm can be overridden. It is set by a call to `visualObj.setUpAudio(options)`. This happens internally when you use any of the audio functions, but you can also call it directly if you are generating audio in another way.
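For example, a minimal sketch (the `abcString` and the `qpm` option are just illustrations; the options mirror the playback options you'd otherwise give the synth):

```javascript
// Sketch: make currentTrackMilliseconds available without starting playback.
// abcString and the qpm value are illustrative.
const visualObj = abcjs.renderAbc("paper", abcString)[0];
visualObj.setUpAudio({ qpm: 120 }); // timing info is now attached to the elements
```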
I didn't see the error when looking at your site, though. Did you go back to beta.13?
For the setProgress - I got it to change the position in the demo before playing the piece, but the position it changed to was wrong - that's a bug that I'll take care of. I'm not sure what's different between your site and the demo. Either it has to do with a difference in how it is called, or the problem happens on certain ABC strings.
Are you using `timingCallbacks` to change the cursor position as the music plays? That should have called `setUpAudio`. If so, there is a callback you might be interested in that detects when you are getting near the end of a line. We use that to scroll so we're not scrolling continually (that was disconcerting to us); instead we scroll a line's worth with a small animation one second before the end of the line. That also has the advantage of scrolling back on repeats.
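Roughly like this (a sketch - the option names are as I understand the `TimingCallbacks` API, and `scrollOneLine` is a placeholder for your own scrolling code):

```javascript
// Sketch: scroll a line's worth shortly before the cursor reaches the end
// of each music line, rather than scrolling continually.
const timer = new abcjs.TimingCallbacks(visualObj, {
  lineEndAnticipation: 1000, // fire the callback one second early (ms)
  lineEndCallback: (info) => {
    scrollOneLine(info); // placeholder: animate the page down one line
  },
});
timer.start();
```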
If you are still having problems getting `currentTrackMilliseconds` set, I can help further with more info about how you are using abcjs and perhaps a sample ABC string.
Hi Paul, thanks for the response. Glad you like the site.
thanks for the info re `currentTrackMilliseconds` - perhaps in my testing I was only trying it before the playback options had been specified - will check again next week.
Yes I switched our site back to beta.13 as we are going live on Monday.
Thanks and I'll get back to you next week :L
Hi again Paul,
We do indeed use the `timingCallbacks`.
To demonstrate the issue I have made two deployments of our site, one using beta.13 and the other beta.14. They both log some stuff into the console.
This one uses 13: https://dxlab-virtuoso-git-beta13.dxlab1.vercel.app/virtuoso/song/national-song-our-sailor-prince
Now when you click on a note (whether the song is playing or not), the element returned does in fact contain `currentTrackMilliseconds`.
Whereas in this one that uses beta14: https://dxlab-virtuoso-git-beta14.dxlab1.vercel.app/virtuoso/song/national-song-our-sailor-prince
...the same element no longer seems to contain it.
(In these deployments I also console log the event for each `eventCallback`, so we can see the `midiPitches` issue I mention in https://github.com/paulrosen/abcjs/issues/410.)
many thanks for your attention :L
I didn't get to this on beta.15, but I will look at it on beta.16. I hope to do another quick beta with various small fixes soon.
I can reproduce both of those issues, so should have a fix soon.
Can you try adding `current[0].setUpAudio()` right after your `renderAbc` call? (I'm guessing at the variable name since your code is minified.)
Also, I see an odd thing in your click handler. The minified code is:

```javascript
var t = e.currentTrackMilliseconds && e.currentTrackMilliseconds[0];
```

That will return undefined. Right now I'm just delivering a number in `currentTrackMilliseconds`, but that's a bug: it should be either a number, or an array of numbers if you click on something in the middle of a repeat. I'll fix that part of it. In the meantime, the `currentTrackMilliseconds` is the time of the last occurrence of that note.
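Until that fix lands, a defensive version of the handler might look like this (a sketch; `seekTo` is a placeholder for your own jump-to-time code):

```javascript
// Sketch: cope with currentTrackMilliseconds being either a number
// (current behaviour) or an array of numbers (the planned fix for repeats).
const raw = e.currentTrackMilliseconds;
const ms = Array.isArray(raw) ? raw[0] : raw; // pick the first occurrence
if (typeof ms === "number") {
  seekTo(ms);
}
```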
Hi @paulrosen
Thanks for looking at this. I'm on leave this week but will get back at it next week.
In the meantime, you may be interested in the blog post we published about how our ABCjs-based project works: https://dxlab.sl.nsw.gov.au/blog/creating-virtuoso

cheers :L
Hi @paulrosen,
OK, while running beta.14 I added `setUpAudio()` after we run `renderAbc` and MAGICALLY `currentTrackMilliseconds` returns!
many thanks for that. I did also have to change the code in the click handler as you suspected.
What you say about using an array of times when the clicked note appears inside a repeat is very interesting. I'll make sure I handle both cases - a number or an array.
I've noticed that by running `setUpAudio()` after we run `renderAbc`, the `midiPitches` info returns to the event, and it seems to have a few more things added to it, like `gap` and `start` - could you explain what these mean or are used for? I notice there is also `cmd` - what values may it contain other than 'note'?
Also, as we asked in https://github.com/paulrosen/abcjs/issues/410, will it be possible to add the staff number to the `midiPitches` object as well?
many thanks! :L
PS: I replied to your comment on our blog post...
That's a really interesting blog post! Your library system sounds like a great place to work. And the project is really impressive.
I'm not sure what the best way to respond to some of your concerns is, but I'll post them here and maybe some of them could become individual issues:
1) "such as a metronome, accompanying drum beats" - I guess you came across the %%drum
notation. That will create a regular pattern of your choosing. I use that to create a metronome all the time.
2) "ABC notation does have a way of annotating double voicing, however our code which handles note events breaks horribly when it encounters such notation" - I assume you were trying the overlay notation: &
? Can you elaborate how that breaks because I might be able to present the note events differently.
3) "however we could find no way to adjust stem direction of individual notes in ABCjs" - I looked into this and it appears there is a directive [I:pos stem up]
and [I:pos stem auto]
that can go around individual notes. I haven't implemented it because I just became aware of it. It will be in some future release.
4) "We managed to get the ‘8va’ itself above the staff, but only as ‘text annotation’, meaning it has no effect on the note data or music." - There have been other requests for implementing 8va. I'll bump up the priority.
5) "Currently various markings in the notation that indicate adjustments in dynamics or volume such as accents, crescendos, forte, pianissimo etc, while being correctly annotated in the ABC notation, do not have any effect on the audio." - I do output that info when creating the audio. Can you elaborate on what doesn't work?
6) "Similarly with timing adjustment such as fermata (or hold) and grace notes." - this is a hard topic because the specifics of this are subjective. I was considering making notes with fermata twice as long (plus, if there are more than one voice, the fermata would have to apply to all voices or things get out of sync.) For grace notes, I've seen various implementations, like "all the gracenotes take 1/2 of the note's value". It appears to be more complicated than that because if the tempo is really slow that doesn't sound right. If I had a good algorithm for fast and slow tempos, long and short notes, and one or many gracenotes then I'll implement it.
7) "it turned out the MIDI pitches data was not tied to the staff the notes belonged to. It was just a list of MIDI data for that moment in time. But in our songs we have 2-3 staves which can be voiced by any of several instruments. Thus we need to know which staff each of the MIDI pitches belongs to, so we can send the note to the correct instrument." - I'm not sure how relevant this is with the current beta but the midi info returned both from timingCallbacks and clickListener contain the property instrument
. It doesn't have the staff, but it sounds like that gives you the instrument info.
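Re point 1, here is a rough sketch of a metronome-style pattern. I'm assuming the `%%MIDI drum` spelling of the directive, and the pattern string, pitches, and velocities are illustrative - check the abcjs docs for the exact form your version supports:

```javascript
// Sketch: a four-click wood-block metronome declared in the ABC header.
// The %%MIDI drum arguments are: pattern letters, then GM percussion
// pitches, then velocities (all assumed values here).
const abcWithMetronome = [
  "X:1",
  "M:4/4",
  "L:1/4",
  "%%MIDI drum dddd 77 76 76 76 110 70 70 70",
  "%%MIDI drumon",
  "K:C",
  "CDEF|GABc|",
].join("\n");
```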
(I meant to send the above comment a while ago - I must not have hit the submit button.)
I don't call `setUpAudio` automatically because if a user isn't interested in audio then it is a waste of time. Plus, you can override things like the tempo, so I'd have to call it again anyway. I don't know, but I suspect your old code was calling my synth functions, which call it behind the scenes, and you stopped calling those.
For the return structure - that is subject to some changes still because it is evolving. `cmd` is an artifact. There are other commands earlier in the process but they are filtered out - for instance, change of instrument or change of tempo. `gap` is an attempt to put the note length logic closer to the note production. At present it is there to differentiate staccato, normal, and slurred notes. `start` is the position of the note in whole notes. That is, the first note will have a start of 0, and if that is a quarter note, the next note will have a start of 0.25.
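As an illustration only (the values are invented, and the real shape is whatever your beta actually emits - log it to be sure), an entry described above might look like:

```javascript
// Illustrative sketch of one midiPitches entry, based on the description
// above; field values are invented, not copied from abcjs output.
const midiPitch = {
  cmd: "note",    // other commands (instrument/tempo changes) are filtered out earlier
  pitch: 60,      // MIDI note number (assumed field name)
  start: 0.25,    // position in whole notes: follows a quarter note that started at 0
  duration: 0.25, // a quarter note (assumed field name)
  gap: 0,         // trims the sounding length: bigger for staccato, ~0 for slurred
  instrument: 0,  // GM program number, per the discussion above
};
```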
Hi again @paulrosen. Thanks for these very detailed responses!
Yes the Library is a great place to work - they let us play with their amazing collection and make fun things with it!
I wasn't aware of the `%%drum` thing - will check it out. Thanks.
Yeah the &
does cause the issue - I replied to your comment on our blog post, which relates to this topic. But a more detailed explanation of the way we do things is:
When the page loads or something changes (tempo, notation etc.) we:

- call `abcjs.parseOnly(notation)` on the abcjs instance to get the JSON data
- build arrays of objects keyed by `startChar` and `endChar` numbers. The objects are some note data such as note, octave, accidental, duration etc. - in the format we need to send to our audio engine, Reactronica, for playback
- `tune = abcjs.renderAbc()`
- `tune[0].setUpAudio()`

Then in the `eventCallback` for each note event, we use the arrays of `startChar`s and `endChar`s to look up the notes in the note list and send them to the audio engine (roughly as in the sketch below).
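A rough sketch of that last step (`noteByStartChar` and `sendToReactronica` are illustrative stand-ins, not our real code):

```javascript
// Sketch: look up our pre-built note objects as the cursor reaches them.
const timer = new abcjs.TimingCallbacks(tune[0], {
  eventCallback: (ev) => {
    if (!ev) return; // abcjs passes null at the end of the tune
    // ev.startChar locates the event in the original ABC string;
    // noteByStartChar is a hypothetical Map built from parseOnly() output.
    const notes = noteByStartChar.get(ev.startChar) || [];
    notes.forEach((note) => sendToReactronica(note));
  },
});
```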
Now, all this parsing of JSON to make our own note list was only necessary because at the time the `midiPitches` info was empty; it was a workaround. It of course meant we were re-doing a lot of work you have clearly done in ABCjs (such as keeping track of timing/note length, adjusting for triplets, flattening and sharpening of notes, key changes and so many other things), but we couldn't see any other way to get the data during the `eventCallback`. And it is by no means perfect or anywhere near as comprehensive as ABCjs.
The &
issue was one thing that used to break it - I stopped it breaking, but it means that the notes after were essentially ignored (I think). Hence our work arounds of moving notes to bass clefs and so on mentioned in the blog post.
In the longer term we hope to be able to do away with our clunky function and use the `midiPitches` data.
And thanks for your nifty trick for annotating the double voicing in your comment on our blog post - as I explain in the reply there, it introduces another issue for us - but we could code around it, given enough time.
I look forward to being able to use `[I:pos stem up]` and `[I:pos stem auto]` in the future to control the stem direction of individual notes - that would be super useful! Likewise with the `8va`.
We couldn't find any volume information in the output of `abcjs.parseOnly(notation)`, so currently we just leave all the notes at the same volume. We have noted that the `midiPitches` data in newer versions of ABCjs contains `volume` info - so as soon as we can have that data tied to each staff, we should switch to using it.
Yes, I agree with you entirely on that one! I read some of the discussions you have had in other GitHub issues about just how grace notes, for example, should work - and agree it is really very variable! I think for us, if fermata-ed notes were held longer (and notes in other voices at the same time had an invisible rest after them, or maybe were held too), that would be great.
I tried to code the grace notes in my parsing function, as they were previously not sounding at all, and I realised that they require delaying the main note, which I could not do very easily - so I just sound them at the same time as the main note, albeit for a shorter duration. Not ideal, but given there is no 'event' for the start of the delayed main note, I have no choice. Perhaps if the `gap` field of `midiPitches` contains a delay, we could eventually use that?
OK, so the `instrument` in `midiPitches` was always `0`, and from what I read before, it referred to the General MIDI instrument (which I thought meant it wasn't relevant for us). I've realised now, after reading your comment above, that we can use the `%%MIDI` part of ABC to declare that `instrument` for each voice.
However, I can't seem to make it work. If we can get it to work, this is HUGE news - as you can see from all my comments above, it will solve a lot of issues for us! Here is an example of my trying to make it work:
```
X:1
M:4/4
L:1/4
%%score 1 2
%%MIDI voice 1 instrument=1
%%MIDI voice 2 instrument=12
V:1 name="VOICE"
z4|E A/ c/ B G/F/|E e c2|
A G/ B/ E F|G2 z2|E A/ c/ B G/F/|
V:2 clef=treble name="PIANO"
"@-5,55 Con Spirito alla Marcia." !f![Ee] [Aa] [cc'] [B/b][A/a]|[Bb] [ee'] [c2c'2]|[Aa] [F/f][d/d'] [cc'] [Bb]|
[Aa] [CEA] z [CEA]|z [CEA] z [B,DG]|z [B,DE] z [CEc]|
```
Using beta.14, I still see all `instrument` values in `midiPitches` as `0`, rather than the `1` and `12` I was expecting.
I also tried:

```
%%MIDI voice VOICE instrument=1
%%MIDI voice PIANO instrument=12
```

with the same result. Am I doing something wrong? If we can get them coming through, that solves it for us!
On `setUpAudio()` - absolutely fair enough that you aren't calling it automatically. Great to know that we can, and that it fixes a bunch of our issues. We actually haven't used any of your synth functions, as I could never find the documentation on them (I think there was a broken link, and Google was no help, surprisingly).
Thanks for explaining those fields in the `midiPitches` data - the `gap` field looks potentially very useful, for grace notes for example...!
Many, many thanks for your help Paul, -Luke
If you look at the return value of `setUpAudio`, you might find you have all the timing info you need. I don't know if it would be easier to convert that structure to the structure expected by Reactronica, but I suspect it would be.
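Something like this is enough to see its shape (a sketch; the exact fields vary by version, so log it rather than relying on my description):

```javascript
// Sketch: inspect the timing structure the synth itself consumes.
const visualObj = abcjs.renderAbc("paper", abcString)[0];
const audioData = visualObj.setUpAudio();
console.log(audioData); // per-track note data with timing, per the discussion above
```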
I've been trying to improve the documentation, and it is a bit better than it used to be. I hope you can get enough info on the synth functionality now. But you might decide Reactronica is better quality anyway - I'd be curious. I haven't had a chance to look at that project yet, but it is in a browser tab!
For changing the instrument, I don't support all the variations of the commands - many were created after I wrote the code and I'm not always aware of them. What should work in your example is:
```
X:1
M:4/4
L:1/4
%%score 1 2
V:1 name="VOICE"
%%MIDI program 1
z4|E A/ c/ B G/F/|E e c2|
A G/ B/ E F|G2 z2|E A/ c/ B G/F/|
V:2 clef=treble name="PIANO"
%%MIDI program 12
"@-5,55 Con Spirito alla Marcia." !f![Ee] [Aa] [cc'] [B/b][A/a]|[Bb] [ee'] [c2c'2]|[Aa] [F/f][d/d'] [cc'] [Bb]|
[Aa] [CEA] z [CEA]|z [CEA] z [B,DG]|z [B,DE] z [CEc]|
```
Also, just curious: is there a reason you used `"@-5,55 Con Spirito alla Marcia."` instead of `"^Con Spirito alla Marcia."`?
Hey Paul!
Thanks for the quick response.
Ahhh, will look into the return value of `setUpAudio()` - thanks.
Yeah, we are probably slightly biased about Reactronica, because my colleague Kaho Cheung made it in his spare time :) - plus our website at the DX Lab is built in React. That means we can have 'music as a function of state', just as the rest of the site has 'UI as a function of state'. It's all open source (as is Virtuoso, actually) if you wanna check out the code.
Ahhh, thanks for explaining that `%%MIDI` thing - I didn't see that variety of the notation on the ABC notation website - will try it out. In fact, I'll do it right now. YES! It works! Excellent.
Oh, re using the `@` symbol to position the text annotation - it was just a fine-tuning of the positioning, to make it look more like the sheet music in our collection. Once I discovered that feature, I used it a lot!
Once again, huge thanks for your help Paul, it is greatly appreciated :L
Just going to jump in on this thread!
Hi Paul, I work with Luke and I thought I'd mention that the codebase for our project is here: https://github.com/slnsw/dxlab-virtuoso
Thanks again for creating (and maintaining) ABC JS. Not sure our project would have got off the ground without it.
Hi Kaho, I look forward to looking at your sound library, but I'm Team Vue, I'm afraid. I'm not sure what it would take to adapt it from React, but I'm guessing it is not trivial.
The original issue is fixed. I'm going to close this thread so that I can keep track of what's done. I have notes of the other features mentioned here.
Hello Paul
I work at the State Library of NSW in Australia, in a small team called the DX Lab. Our mission is to create interesting digital experiments that highlight subsets of the Library's diverse and unique collection.
As such we have been using ABCjs in a soon-to-be-launched project that lets the user 'hear' some of our sheet music collection.
As part of the interface, we allow the user to click on a note to jump the song to that position. To do this I am using the `currentTrackMilliseconds` in the element that is returned when a note is clicked on, and then using that and the total length of the song to call `setProgress()`. On the whole it works nicely, even when the song is playing.
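A condensed sketch of that wiring (`timer` is our abcjs timing object; `totalTimeMillis` is an illustrative name for the song length we compute, and the 0..1 fraction passed to `setProgress()` is my reading of its argument):

```javascript
// Sketch: jump playback to the clicked note. Assumes the beta.13
// behaviour where the clicked element carries currentTrackMilliseconds.
function clickListener(abcElem) {
  const ms = abcElem.currentTrackMilliseconds;
  if (ms == null) return;
  timer.setProgress(ms / totalTimeMillis); // fraction of the whole tune
}
```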
Here is a development version of the site for you to check out.
Now to my issue(s):
I was checking out beta.14 and noticed `currentTrackMilliseconds` seems to have disappeared from that element.
Is that a permanent change? Or has that info moved somewhere else?
It would be a shame if it were removed permanently, as we would lose this feature in our experiment.
There is a related issue: unless the song has been started (or started and stopped), `setProgress()` doesn't seem to work. The timer object definitely exists, but regardless of calling `setProgress()`, starting the song always seems to play from the start.
You may be interested to know the site uses Reactronica, an audio library made by my colleague @unkleho in his spare time(!)
Thanks for any help! :L