mrbungle73 / editor-on-fire

Automatically exported from code.google.com/p/editor-on-fire

Allow MIDI playback #121

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
I was thinking about how work-intensive it is to prepare a new set of vocal 
tone samples.  Since Allegro has built-in support for MIDI, it may be 
worthwhile to add a user option that tries to initialize the system's default 
MIDI device so that EOF's output can optionally be sent to a software/hardware 
synthesizer.

Besides just sending Note On and Off data, we'll need to consider how to allow 
the user to specify which synth tone to use.  When I created the piano tones 
for EOF, I learned a bit about this process, but haven't mastered it fully.  
I'm looking into the creation of Cakewalk instrument definition files, which 
seem to be the most common format of instrument definitions.  Anvil Studio will 
allow you to define instrument information for its internal use if the MIDI 
package is purchased ($20 or so, but I don't know yet whether it will allow an 
instrument definition file to be created).  I'm willing to buy it to test 
creating definition files, and with any luck could write functions to import 
such definitions into EOF.

This would allow for a good deal of flexibility with customizing the 
sound/volume of vocal tones.  This may also be a good way to begin MIDI 
integration, should EOF be given the ability to accept MIDI input in the future 
(ie. piano record mode, if FoF gains the ability to play keyboard tracks).

Besides vocal tones, it could allow the user to supply their own clap, 
metronome, etc. without having to provide recorded audio samples to EOF.

Original issue reported on code.google.com by raynebc on 23 Jun 2010 at 5:22

GoogleCodeExporter commented 9 years ago
Re-reading my Casio's MIDI implementation document: to enable a non-General 
MIDI tone for playback via MIDI commands (when the synthesizer is being 
controlled by MIDI events), a Bank Select event followed by a Program Change 
event is required.  This is consistent with the method I devised when I was 
testing the recording of MIDI tones that I play back via a MIDI file.  
Generally, I'm sure most synthesizers come with documentation for accessing all 
of the synthesizer's tones by specifying the bank and program change numbers 
for each tone.

So for customizing which sounds are played, the user could specify the bank and 
program change numbers for their MIDI device.  Users who prefer headphones may 
be able to rig the OS to play the synth's tones through the PC's current audio 
output device by plugging the synth's audio output into the computer's line 
in/microphone jack.  This would take some testing.
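
For reference, a minimal sketch of what that bank/program selection could look 
like if Allegro 4's midi_out() is used to push the raw bytes (Allegro and a 
MIDI driver are assumed to be installed already; the channel, bank, and program 
numbers are placeholders that would come from the synth's documentation):

```c
#include <allegro.h>

/* Select a tone by bank and program number on one MIDI channel.
   Bank Select is CC 0 (MSB) and CC 32 (LSB), followed by Program Change. */
static void select_tone(int channel, int bank_msb, int bank_lsb, int program)
{
    unsigned char msg[8];
    int n = 0;

    msg[n++] = 0xB0 | (channel & 0x0F);   /* Control Change: Bank Select MSB */
    msg[n++] = 0x00;
    msg[n++] = bank_msb & 0x7F;
    msg[n++] = 0xB0 | (channel & 0x0F);   /* Control Change: Bank Select LSB */
    msg[n++] = 0x20;
    msg[n++] = bank_lsb & 0x7F;
    msg[n++] = 0xC0 | (channel & 0x0F);   /* Program Change */
    msg[n++] = program & 0x7F;

    midi_out(msg, n);
}
```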

Original comment by raynebc on 24 Jun 2010 at 8:05

GoogleCodeExporter commented 9 years ago
After some discussion on the Allegro IRC channel, it doesn't seem like good 
results would be feasible with Allegro 4.x.  Allegro 5.x is supposedly going 
to have much better timing functions, which would make playing a pre-created 
MIDI file, or manually sending Note On and Off events, much more accurate.

Original comment by raynebc on 25 Jun 2010 at 6:00

GoogleCodeExporter commented 9 years ago
Since full MIDI playback might be problematic, perhaps EOF could instead have 
the user plug in a synth as a MIDI device and plug its audio output into the 
computer's microphone input.  That way, EOF could start and stop a MIDI note 
and record the sound.  This would make it easy for people to record custom tones.

Original comment by raynebc on 28 Jun 2010 at 10:24

GoogleCodeExporter commented 9 years ago
With Allegro 4's cruddy timer the MIDI sound would be impossible to sync up 
with the streamed audio. Using the system's MIDI functions directly should 
deliver consistent MIDI output. It might be worth looking into whether there is 
a cross-platform MIDI output library. Allegro 4 provides a way to send MIDI 
events to a MIDI controller but it is subject to Allegro's timers due to the 
way it is implemented. We could probably rip the code from Allegro to make our 
own MIDI controller access routines.

We would need a new MIDI latency setting so we could sync the MIDI up to the 
digital audio stream. I think it would be feasible to tie MIDI output into the 
stream mixer callback so that the MIDI triggers at the correct time. We would 
just have to offset the triggers so that the MIDI latency is taken into account.
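
A toy illustration of the offset idea, with made-up names (the latency value 
would come from the proposed setting):

```c
/* A cue scheduled at cue_pos_ms becomes due once the audio position,
   advanced by the expected MIDI latency, reaches it, so the synth's
   sound lands in step with the digital audio stream. */
static int eof_midi_latency_ms = 100;   /* hypothetical user setting */

static int midi_cue_due(unsigned long cue_pos_ms, unsigned long music_pos_ms)
{
    return (music_pos_ms + eof_midi_latency_ms) >= cue_pos_ms;
}
```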

Original comment by xander4j...@yahoo.com on 29 Jun 2010 at 5:33

GoogleCodeExporter commented 9 years ago
I hadn't even considered piggybacking the MIDI output on the OGG output; that 
actually seems like a much more efficient way to do it.  The only difference 
from the current audio cue system is that it would need to send a Note Off 
event at the appropriate time.

Allegro provides the midi_out() function to send raw MIDI data to the MIDI 
controller, but as you mentioned, I don't know if there's a considerable lag 
with this.  We'd probably just have to try it and find out.
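
As a minimal sketch, a Note On/Off pair pushed through midi_out() might look 
like the following; whether the bytes reach the device immediately or sit in 
Allegro's queue is exactly the open question here, and rest() is only used to 
let the tone ring in this standalone test:

```c
#include <allegro.h>

/* Play a single test tone: raw Note On, wait, raw Note Off.
   Assumes allegro_init() and the sound/MIDI setup were done already. */
static void play_test_tone(int channel, int note, int velocity)
{
    unsigned char on[3]  = { 0x90 | (channel & 0x0F), note, velocity };
    unsigned char off[3] = { 0x80 | (channel & 0x0F), note, 0 };

    midi_out(on, 3);
    rest(500);          /* let the tone ring for half a second */
    midi_out(off, 3);
}
```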

This would probably be a good foundation for PART KEYS support in EOF.  If 
somebody was making a full band chart, having the chart audio, 
clapping/ticking, vocal tones and key tones all play at once may be a bit 
taxing on ALOG, especially since PART KEYS will have to support chords.  I'm 
not sure if there's really a limit on how many audio samples can play at once 
without causing problems, but offloading PART KEYS to a synth would lessen the 
burden of waveform audio playback.  It would also allow the user to play the 
keys track with any synth voice he/she wants.

Original comment by raynebc on 29 Jun 2010 at 7:04

GoogleCodeExporter commented 9 years ago
Note Off events wouldn't be a problem. It would just be another variable 
attached to the cue that tells it when to shut off.

Allegro's midi_out() queues a message up in Allegro's MIDI player, which is 
timer based. We need a way to output MIDI messages directly to the MIDI 
controller so we only get the latency of the MIDI device.
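
To make that concrete, the cue might carry the Note Off time alongside the 
Note On time, along these lines (field names are illustrative, not EOF's 
actual structure):

```c
typedef struct {
    unsigned long on_ms;    /* when to send the Note On */
    unsigned long off_ms;   /* the extra variable: when to send the Note Off */
    unsigned char channel, note, velocity;
} midi_tone_cue;
```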

I like the idea of having MIDI support. We should also have digital sample 
support in case the user's system has no MIDI support. Windows is good about 
providing a software synthesizer if there are no hardware MIDI devices, but 
Linux does not provide one out of the box and setting one up takes some effort.

Original comment by xander4j...@yahoo.com on 29 Jun 2010 at 11:02

GoogleCodeExporter commented 9 years ago
If using midi_out() to send MIDI commands does introduce delay, that would be 
problematic.  Is it that the MIDI instructions are only sent to the system's 
MIDI device at a predefined interval determined by Allegro's timing system (ie. 
once per millisecond)?  I'll try to find out more, but I'm sure we can just use 
the internal code/functions that Allegro uses so we have ACTUAL real-time MIDI 
control.

Having digital samples as a fallback was what I had in mind, such as if Allegro 
failed to initialize a MIDI device, but was able to initialize just the sound 
device.

Regarding the recording functionality I proposed, do you know if Allegro would 
be able to do the trick, or should I continue to pursue using Audacity to 
record and split the tones?  It would be cool if we could offer an external 
download to add several sets of tones to EOF, perhaps extra percussion to 
replace the clap/tick, and drum cue samples (ie. snare, tom, cymbal).

Original comment by raynebc on 29 Jun 2010 at 11:23

GoogleCodeExporter commented 9 years ago
From what I can tell, midi_out() places MIDI events in Allegro's MIDI queue. 
When the MIDI timer fires, the events in the queue are sent to the device. I'll 
look into it more but that's what I got from a brief glance at the code.

Allegro can record audio but I've never used that part of the API. If you could 
get the timing right it should be possible to automatically record MIDI tones 
as you proposed.

Original comment by xander4j...@yahoo.com on 30 Jun 2010 at 4:09

GoogleCodeExporter commented 9 years ago
So far, it's looking like midi_note_off() and midi_note_on() are likely the 
lowest-level cross-platform functions available.  These each call 
midi_driver->raw_midi() with raw MIDI commands, and as far as I can tell, 
raw_midi() is implemented by the platform-specific MIDI driver.
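
If raw_midi() really does take the raw stream one byte at a time (worth 
confirming against the Allegro source), bypassing the timer-driven queue might 
look roughly like this untested sketch:

```c
#include <allegro.h>

/* Push a Note On straight at the platform MIDI driver, skipping Allegro's
   MIDI player.  Guarded in case the installed driver has no raw_midi hook. */
static void raw_note_on(int channel, int note, int velocity)
{
    if (midi_driver && midi_driver->raw_midi) {
        midi_driver->raw_midi(0x90 | (channel & 0x0F));
        midi_driver->raw_midi(note & 0x7F);
        midi_driver->raw_midi(velocity & 0x7F);
    }
}
```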

Original comment by raynebc on 30 Jun 2010 at 10:39

GoogleCodeExporter commented 9 years ago
r240 and r241 begin work on MIDI note output.  It should be feasible to have 
the callback queue a MIDI note to turn on and off when eof_music_pos is at or 
after certain timestamps.

Original comment by raynebc on 4 Jul 2010 at 7:36

GoogleCodeExporter commented 9 years ago

Original comment by raynebc on 31 Aug 2010 at 8:14

GoogleCodeExporter commented 9 years ago
r444 implements the currently untested logic.

Original comment by raynebc on 7 Oct 2010 at 8:38

GoogleCodeExporter commented 9 years ago
If and when keyboard charting is supported, it will be a good idea to ensure 
that MIDI data for playing back that instrument part is sent to a different 
channel than the MIDI data for PART VOCALS.  A dialog for selecting the 
instrument bank and tone number for each (PART KEYS and PART VOCALS) would also 
be nice.

Original comment by raynebc on 7 Oct 2010 at 11:19

GoogleCodeExporter commented 9 years ago
For some reason, the MIDI tones are triggering early.

Original comment by raynebc on 7 Oct 2010 at 11:58

GoogleCodeExporter commented 9 years ago
It seems to be almost exactly 200ms early.  The chart's AV delay is 300 and the 
buffer size is 6144.

Original comment by raynebc on 8 Oct 2010 at 12:01

GoogleCodeExporter commented 9 years ago
The results will probably be better with a timed callback function that 
triggers Note On/Off events to send to a MIDI device.  This will likely be more 
feasible in Allegro 5.

Original comment by raynebc on 8 Feb 2011 at 6:54

GoogleCodeExporter commented 9 years ago
On hold until after the port to Allegro 5, which should have better facilities 
for this type of enhancement.

Original comment by raynebc on 22 Mar 2011 at 12:20

GoogleCodeExporter commented 9 years ago
This is very useful for verifying that a chart is correct. I threw together 
some code for this for my own use; it isn't pretty, but it works for me. :)

Original comment by quarns...@gmail.com on 22 Jun 2013 at 7:10

Attachments:

GoogleCodeExporter commented 9 years ago
Are you getting a good 0.25-0.5 seconds of lag regarding the MIDI playback, or 
is it just me (I didn't hook up a hardware synth to try this, just using 
Windows' default MIDI device)?  I use an AV delay of 300ms, so perhaps this has 
something to do with it, seeing as the find_claps logic you provided offsets 
the pro guitar MIDI tones by this value.

I'm not sure why you implemented a while loop in the OGG callback though; 
shouldn't it work just by playing the note once when it's time?

Also, it looks like it should support playing several notes at once for chords, 
but what I'm hearing is that it's only playing one tone per note.  Do you get 
different results?

In the part where you determine what MIDI note to play, I'm not sure where you 
got those MIDI values.  Take a look at 
eof_lookup_default_string_tuning_absolute() to see the code I've implemented to 
look up the standard open tunings for 4/5/6 string bass and 6 string guitar.

Original comment by raynebc on 23 Jun 2013 at 2:03

GoogleCodeExporter commented 9 years ago
> Are you getting a good 0.25-0.5 seconds of lag regarding the MIDI playback, or is it just me (I didn't hook up a hardware synth to try this, just using Windows' default MIDI device)?

I get them at the expected time if I add the eof_av_delay back (which the last 
patch also did, although I beautified it in this attached patch). My delay is 
120ms FWIW.

> I'm not sure why you implemented a while loop in the OGG callback though; shouldn't it work just by playing the note once when it's time?

The idea is to fire all notes in a chord at once.

> but what I'm hearing is that it's only playing one tone per note.  Do you get different results?

Sorry, I implemented the code for this but didn't actually end up calling it :)
Fixed in the attached patch.

> In the part where you determine what MIDI note to play, I'm not sure where you got those MIDI values.

You're right, I'm one octave too low. The source I found when looking up the 
proper notes stated that 0 was C0, rather than the actual C-1.  I'm now calling 
eof_lookup_default_string_tuning_absolute in the attached patch.

Original comment by quarns...@gmail.com on 23 Jun 2013 at 8:39

Attachments:

GoogleCodeExporter commented 9 years ago
I applied those changes, with a couple of alterations, in r1162.  I'm still 
getting lag though (at least 100ms).  Earlier in the ticket, NewCreature 
mentioned that Allegro's midi_out() doesn't process MIDI commands immediately; 
it queues them for processing.  Apparently 100ms of delay isn't uncommon when 
using the default Windows software MIDI synthesizer.  Are you using a different 
one, like one built into your sound card, or an external synth (like a Casio)?

Original comment by raynebc on 24 Jun 2013 at 1:45

GoogleCodeExporter commented 9 years ago
I'm on OS X, where the default is Core Audio, and I'm pretty sure it's all 
software. Maybe a new config option for MIDI delay is needed?

Original comment by quarns...@gmail.com on 24 Jun 2013 at 5:44

GoogleCodeExporter commented 9 years ago
I'm going to look into some other options like plugging in a hardware 
synthesizer or using an alternate software synth (like BassMIDI).

Original comment by raynebc on 24 Jun 2013 at 6:03

GoogleCodeExporter commented 9 years ago
It works much better with a hardware synth attached and set as the default MIDI 
device.  Another neat feature might be to allow the user to define which MIDI 
instrument is to be used, such as if they want to use a distortion guitar.  
This could be a simple dialog function or something.

Original comment by raynebc on 25 Jun 2013 at 7:22

GoogleCodeExporter commented 9 years ago
As long as the option of using MIDI hardware, or hardware emulated via MIDI 
drivers, doesn't disappear. I don't really have any, but I like the option of 
being able to plug in some hardware or some high-quality driver/synth without 
EOF even having to know that something changed (ie. it just sends the MIDI 
commands).

It would probably also be useful to be able to switch the MIDI instrument 
dynamically during the song, as for example clean guitar, muted guitar, and 
various effect pedals might be used at different points in it.

Original comment by quarns...@gmail.com on 26 Jun 2013 at 8:53

GoogleCodeExporter commented 9 years ago
I tried a couple of different guitar voices, but they either sounded horrible 
(distortion) or didn't gradually soften before the tone stopped playing 
(overdrive).  I know the quality of the sound will vary from one synth to the 
next, but I imagine it's pretty standard whether a given tone sustains until 
manually stopped, so it may require more code to force Note Off events to be 
sent.  To complicate that, the Note Off event shouldn't be too early (ie. short 
notes, or even notes with no tail, should be allowed to play for a half second 
or so, unless there's another note on that string sooner than that).  Having the 
instrument change dynamically is an interesting idea, but it may be better to 
refine the MIDI playback features before adding more bells and whistles.

People have been mentioning that notes sound with a varying amount of lag, even 
when they're equally spaced from each other (ie. several 1/8 notes in a row).  
I was thinking we should move the MIDI note playing logic out of the OGG 
callback function, since that function probably runs at unequal intervals.  
Some of the earlier MIDI playback logic I wrote (ie. eof_process_midi_queue() 
and related functions) may be a good place to look, or otherwise just a 
function called from EOF's main program loop that uses the code from the OGG 
callback to send events to the MIDI device at the appropriate time.
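
A rough sketch of that main-loop approach, assuming a pre-built, time-sorted 
cue list and a millisecond playback position like eof_music_pos; the structure 
and names here are illustrative, not EOF's actual code:

```c
#include <allegro.h>

typedef struct {
    unsigned long on_ms;            /* when the note should start sounding */
    unsigned long off_ms;           /* when it should stop */
    unsigned char channel, note, velocity;
    char on_sent, off_sent;         /* reset when the user seeks */
} midi_cue;

/* Call once per iteration of the main program loop. */
static void poll_midi_cues(midi_cue *cues, int count, unsigned long pos_ms,
                           int latency_ms)
{
    unsigned char msg[3];
    int i;

    for (i = 0; i < count; i++) {
        if (!cues[i].on_sent && pos_ms + latency_ms >= cues[i].on_ms) {
            msg[0] = 0x90 | cues[i].channel;   /* Note On */
            msg[1] = cues[i].note;
            msg[2] = cues[i].velocity;
            midi_out(msg, 3);
            cues[i].on_sent = 1;
        }
        if (cues[i].on_sent && !cues[i].off_sent &&
            pos_ms + latency_ms >= cues[i].off_ms) {
            msg[0] = 0x80 | cues[i].channel;   /* Note Off */
            msg[1] = cues[i].note;
            msg[2] = 0;
            midi_out(msg, 3);
            cues[i].off_sent = 1;
        }
    }
}
```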

Original comment by raynebc on 26 Jun 2013 at 5:00

GoogleCodeExporter commented 9 years ago
In addition to better Note Off support, I was also thinking about supporting 
bends/slides via the MIDI pitch wheel command and changing the Note On velocity 
for hammer-ons/pull-offs.  At that point, either the full note data would have 
to be used rather than the current "guitar_midi_note" structure, or we just 
make find_claps queue all the MIDI events.
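
For the bends/slides part, the raw pitch wheel message is a three-byte event 
carrying a 14-bit value (8192 = no bend) split into LSB/MSB.  A minimal sketch, 
assuming Allegro's midi_out() is used for output:

```c
#include <allegro.h>

/* Send a pitch wheel change on one channel.  bend ranges 0..16383, with
   8192 meaning no bend; how far the pitch actually moves per unit depends
   on the synth's pitch bend sensitivity setting. */
static void send_pitch_bend(int channel, int bend)
{
    unsigned char msg[3];

    msg[0] = 0xE0 | (channel & 0x0F);
    msg[1] = bend & 0x7F;          /* LSB */
    msg[2] = (bend >> 7) & 0x7F;   /* MSB */
    midi_out(msg, 3);
}
```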

If I read the queue code correctly, it removes events from the queue when the 
notes are turned off, which isn't great for seeking. 

Also I don't see myself using midi file export, but it would be good if the 
logic for playing midi tones and saving them could be merged into a shared code 
path where possible.

Do you want me to look at any or all of these?

Original comment by quarns...@gmail.com on 27 Jun 2013 at 10:00

GoogleCodeExporter commented 9 years ago
I added logic to move the firing of MIDI events into the main program loop.  
This should help with the varying lag.  I also added a config file entry for a 
MIDI tone delay, so people can mostly make up for delay in their operating 
system's MIDI handling.

The previous MIDI event queue logic probably isn't that suitable for our 
purposes; it's just some code I added a while ago that didn't end up working 
the way I wanted.  I'll probably remove it eventually.

The MIDI files that EOF exports are for use in some rhythm games (Frets on 
Fire, Phase Shift, Rock Band), but they do not work like a standard MIDI file.  
The MIDI note numbers are used to mark gems played in-game, or various 
statuses, and don't directly relate to the actual music notes being played.  
Only the pro keys tracks in Rock Band use real MIDI note values to represent 
the played notes in-game, but EOF doesn't support authoring that instrument 
track yet.  So for now, MIDI export will remain separate from the MIDI tones 
feature.

Outputting MIDI events to account for those techniques would definitely be a 
nice touch.  I think we have at least a couple options on how to do this:

1.  Expand the guitar_midi_note structure to include information about any 
statuses in use.  eof_play_queued_midi_tones() or eof_midi_play_note_ex() would 
then have to write MIDI events accordingly.

2.  Change the guitar_midi_note structure to only store a timestamp and the EOF 
note number.  The MIDI tone playback logic would then be able to just read 
whatever statuses existed for that note by looking at the note's flags variable.

The first option is probably best for the sake of playback speed.
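
A rough sketch of what option 1 could look like; the field layout here is 
purely illustrative and not EOF's actual guitar_midi_note structure:

```c
/* Everything the playback code needs is captured up front, so nothing has
   to be looked up from the note's flags at trigger time. */
typedef struct {
    unsigned long pos;        /* Note On timestamp (ms) */
    unsigned long endpos;     /* Note Off timestamp (ms) */
    unsigned char channel;
    unsigned char note;       /* MIDI note number */
    unsigned char velocity;   /* e.g. lowered for hammer-ons/pull-offs */
    char bend;                /* pitch wheel amount in half steps, 0 = none */
    char slide_end;           /* slide target fret, -1 = no slide */
} guitar_midi_note_ex;
```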

Original comment by raynebc on 27 Jun 2013 at 6:20

GoogleCodeExporter commented 9 years ago
> The first option is probably best for the sake of playback speed.

Is it really that time consuming, though?  On my machine the *whole* find_claps 
function finishes in 0.1-0.2 milliseconds (yes, 0.0001-0.0002 seconds), 
indicating that there's no need for any precalculation at all.  Usually one 
wants the mix routine near real time for instant feedback on interactions; in 
EOF, however, most of the sounds are static and we know well ahead of time 
exactly when they will be played back, so latency is not an issue.

Even if realtime feedback were required,

a) We're not actually mixing the MIDI notes into the buffer, so it can be 
separate from the mix routine (and with your latest change it actually is).

b) We don't need to process all notes, just keep track of the next one to play, 
which means the actual overhead could be even smaller than the measured 0.1-0.2 
milliseconds. The additional call to midi_out and the processing of the more 
advanced features of the note would have to be added to the overhead, but 
again, we only need to process a single chord at a time at most, and then we 
can sleep until it's time to play the next note. Any overhead of midi_out would 
be unavoidable anyway, no matter the approach used.

c) The smallest buffer size that can be used with my soundcard before the audio 
out from Rocksmith starts to crackle appears to be 156 bytes. At a 48kHz 16-bit 
stereo configuration, that means an inherent latency of 
156/(48000*2*2)*1000 ≈ 0.81ms. Moot point, as again most sounds in EOF will be 
known ahead of time, so the soundcard latency doesn't matter much: the user can 
just tweak the appropriate delay values to get it to sound in sync anyway.

d) At a BPM of 120 in 4/4, the spacing between two quarter notes would be 
60*1000/(120*4/4) = 500ms. At a BPM of 360 in 4/4, the spacing between two 
128th notes would be 60*1000/(360*128/4) ≈ 5.2ms. A song at that tempo written 
with 128th-note-length barre chords on every single beat for a 12 string guitar 
is very unlikely, but 5.2ms would then be 5.2/12 ≈ 0.43ms per string in the 
chord, which is still more than double the time the full find_claps function 
actually took with the songs I tested it with, and again, see b).

Original comment by quarns...@gmail.com on 28 Jun 2013 at 12:15

GoogleCodeExporter commented 9 years ago
It depends.  If people change the active difficulty or track during playback, 
it has to rebuild all the sound cue times.  In extreme cases, people author 
really long charts (full album length), and I've seen changing the active 
difficulty/track during playback cause EOF to stutter and desync.  On less 
powerful computers, external processes can use enough resources to cause EOF to 
lag here or there as well.  So as far as I see it, every reasonable 
optimization helps.

For best results, I think the start time and note number of each note to handle 
for MIDI tones needs to be tracked before playing, to account for the user 
seeking during playback.  That's the fast part anyway; the MIDI events are only 
sent once a note is actually reached.

Original comment by raynebc on 28 Jun 2013 at 4:06