DerekCook / CoreMidi4J

Core MIDI Service Provider Interface (SPI) for Java 1.7 and above on OS X
Eclipse Public License 1.0

Time base needs to be consistent #6

Closed brunchboy closed 8 years ago

brunchboy commented 8 years ago

It was a great idea to implement getMicrosecondPosition() for the CoreMIDI4J devices, but they need to return time values that are consistent with the time stamps being attached to MIDI events. Both of those values are supposed to represent “the device’s notion of time” according to the Java MIDI documentation, and in Core MIDI the device’s notion of time is monotonically increasing microseconds since the kernel booted. That is what the time stamp values in MIDI events currently are. But right now, when someone calls getMicrosecondPosition(), the value returned is not consistent with these time stamps: instead, the time at which the port was opened is recorded and subtracted from the current time, and that difference is returned.

MIDI code is supposed to be able to reason about the meaning of an event time stamp by relating it to the value returned by getMicrosecondPosition() by the device on which the event was received or to which it will be sent, so these time values need to be measured from the same point.

When I look at a MIDI event time stamp right now, for example, I see this value: 231956774679. Calling getMicrosecondPosition() on the same device gives me the value 102831564. Then getting another time stamp a bit later yields 232803489903. The time stamps are not in the same time base as getMicrosecondPosition(). And the Java MIDI documentation states they should be:

To learn whether a device supports time stamps, invoke the following method of MidiDevice:

 long getMicrosecondPosition()

This method returns -1 if the device ignores time stamps. Otherwise, it returns the device’s current notion of time, which you as the sender can use as an offset when determining the time stamps for messages you subsequently send.

I would suggest that the simplest and most natural approach for working with CoreMIDI would be to remove the startTime instance field, and always simply return the value of calling getMicrosecondTime().

If for some reason you want to continue in a mode where each port has a different sense of when time began, then you will need to adjust the time stamp values attached to MIDI events returned by that port so that they have the port’s startTime subtracted from them too. To be safe, you should also perform the opposite operation on any time stamp values coming from Java towards Core MIDI, in case any other Core MIDI clients try to honor them; they should follow the Core MIDI specs once we hand them off to it. Thus I would lean towards the option of removing startTime, and simply staying with the Core MIDI definition of device time.

DerekCook commented 8 years ago

I believe I have correctly implemented the notion of Device Timestamps according to the Oracle Java documentation:

From the Javadocs for MidiDevice and the getMicrosecondPosition() method.

long getMicrosecondPosition()

Obtains the current time-stamp of the device, in microseconds. If a device supports time-stamps, it should start counting at 0 when the device is opened and continue incrementing its time-stamp in microseconds until the device is closed. If it does not support time-stamps, it should always return -1.

Returns: the current time-stamp of the device in microseconds, or -1 if time-stamping is not supported by the device.

And from the Oracle page Transmitting and Receiving MIDI Messages

To learn whether a device supports time stamps, invoke the following method of MidiDevice:

long getMicrosecondPosition()

This method returns -1 if the device ignores time stamps. Otherwise, it returns the device's current notion of time, which you as the sender can use as an offset when determining the time stamps for messages you subsequently send. For example, if you want to send a message with a time stamp for five milliseconds in the future, you can get the device's current position in microseconds, add 5000 microseconds, and use that as the time stamp. Keep in mind that the MidiDevice's notion of time always places time zero at the time the device was opened.
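
As a concrete illustration of the contract quoted above, here is a minimal sketch using the standard javax.sound.midi API (device selection and error handling are simplified, and it assumes the chosen device actually supports time stamps):

```java
import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

public class FutureTimestampExample {

  public static void main(String[] args) throws Exception {
    // Purely illustrative: grab the first installed device rather than letting the user choose.
    MidiDevice device = MidiSystem.getMidiDevice(MidiSystem.getMidiDeviceInfo()[0]);
    device.open();
    try (Receiver receiver = device.getReceiver()) {
      long now = device.getMicrosecondPosition();      // -1 if the device ignores time stamps
      long timeStamp = (now == -1) ? -1 : now + 5000;  // five milliseconds in the future
      ShortMessage noteOn = new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93);
      receiver.send(noteOn, timeStamp);                // the receiving end decides how to honor it
    } finally {
      device.close();
    }
  }
}
```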

So, I think the implementation is correct, even if not quite what is logically expected.

One thing we could do is preserve the "contract" of the method as a default, but put a method into CoreMidiSource and CoreMidiDestination that allows you to change the timestamps to absolute values if you want them.

DerekCook commented 8 years ago

Thinking about it a little more.....

There is probably no sense in CoreMidiSource supporting timestamps. You are receiving messages that have an absolute time stamp, and, thinking about it, I can't see much point in getting timestamps from the device; you are interested in the message timestamp. So I may revert CoreMidiSource to returning -1 for getMicrosecondPosition().

CoreMidiDestination should, I think, support timestamps, but it currently does nothing different when a message has a timestamp other than -1. Any message with a timestamp other than -1, and with a future time reference compared to CoreMidiDestination's time reference, should be queued rather than sent on immediately. There should then be a Timer routine that periodically checks the queue and sends any messages that are now due to be sent.

Should not be too hard to do, and is part of improving the usefulness of CoreMIDI4J to other developers.

brunchboy commented 8 years ago

It’s a bit of a tricky situation. The timestamps from the device are supposed to relate to the timestamps in the messages. You can query the device and use that to decide what you want to use in the timestamp you send to make the event happen later, or realize that you have been asked to delay acting on an incoming message. I believe that in CoreMIDI’s definition of timestamps, all devices are “opened” when the kernel boots, even if they were connected later. As long as it is consistent, it works.

As I noted in my initial discussion, if you want to hew closer to the spirit of Java’s documentation, we can offset the message timestamps for each device by subtracting the startTime when sending them from Core MIDI to Java, and adding it back when sending them from Java to Core MIDI. That would honor the expected behavior in both worlds. And I think if we go down that approach, we should simply do it, there is no need to complicate matters with a configuration option.

But I don’t think that CoreMIDI4J should ever delay the message. If it gets a message with a future timestamp, it should pass that along. If, for example, you are using Apple’s network MIDI implementation to send it to a synth running on your iPhone, the timestamp will be passed along in the network packet, and the synth can delay it appropriately before playing it, perhaps to synchronize it with some other event that is known to be coming up; the future-dating is used to smooth out network lag. Having CoreMidiDestination delay the send would destroy the protocol’s ability to account for network jitter. I would think that if a CoreMIDI-to-hardware interface received a future-dated packet, it would be its responsibility to delay sending it to the MIDI hardware, not CoreMidiDestination’s.

brunchboy commented 8 years ago

The RTP MIDI standard (RFC 4695), which Core MIDI implements in OS X and iOS for Apple’s nice, user-friendly network MIDI support, has lots to say about timestamps. The problem of how to bridge the world of modern network protocols, which send timestamp information, and old slow MIDI 1.0 DIN cable serial interfaces, which do not, is essentially insoluble, but we don’t have to deal with it. Some hardware interfaces will undoubtedly just ignore the timestamp information we send, but I am not worried about those; they are not the situations in which people are trying to use timestamps to counteract network jitter anyway. I just want to make sure we don’t break things for people using the modern protocols which do support and honor timestamps.

I think we can do that either by sticking with Core MIDI’s timestamps, which are always interpreted relative to kernel boot, or the way you have done them, which is to subtract the device startTime from them, as long as we apply that translation to messages going from Core MIDI to Java, and reverse it for messages going in the other direction.

Interestingly, Humatic (the MMJ folks) offer a compatible network MIDI implementation for Android, nmj, and I have experimented with a Windows implementation called rtpMIDI to control Windows-only laser show software from Afterglow (although I ended up moving to a fancier version of the software which offered more flexible direct network control). It is a surprisingly thriving world.

brunchboy commented 8 years ago

Thinking about it even more… the packets I have seen, coming from USB MIDI controllers connected to my Mac, have timestamps whose “zero” is when the Mac booted. But packets coming over network interfaces, or from sequencer software, might well (even very likely) have different “zero” points. And I don’t know of any API that will let us query Core MIDI to determine that value. That API is what we would need, I think, to correctly implement getMicrosecondPosition() because it is supposed to correspond to the timestamps in the packets for that device.

So I am coming to the conclusion that unless we can find the API by which we can delegate the responsibility for getMicrosecondPosition() to Core MIDI, we should admit that we do not know what the current time is for any given device, and return -1. And, when it comes to the timestamps within MIDI packets themselves, we should do nothing other than converting them from Mach Absolute Time Units to microseconds when going from Core MIDI to Java, and in the other direction when going from Java to Core MIDI. That will allow protocols like RTP MIDI to work as designed without interference from us.

And I think we have forgotten about that last bit, converting from microseconds to Mach Absolute Time Units when receiving timestamps from Java… or have we not?

DerekCook commented 8 years ago

I think we might actually be in danger of agreeing with each other!

To reiterate, timestamps from CoreMidiSource are sent with the messages, and I am not proposing to change that. I am proposing to revert getMicrosecondPosition() to returning -1 as it is meaningless in this context, given the received messages have absolute timestamps.

In terms of going from Java to Core MIDI, that is where some intelligence is needed if the source program wants to send messages in the future, and why I was proposing to change the implementation to allow that. In this case, unless I am reading the Java documentation wrong, this is the whole purpose of the getMicrosecondPosition() method when you transmit messages where you do care about the timestamp. The Java documentation is quite clear about the reference being zero at the point you open the device, and programs need to offset as required, relative to the current time reference returned by the device, if they want messages to be sent at a future point. In this context, I still think my first reply stands, unless I am completely misinterpreting the Java documentation. So long as you are offsetting relatively, the actual start time is a moot point (unless you are at risk of overflow), but the docs seem clear to me.

brunchboy commented 8 years ago

I think we are agreeing as well! And while all the messages we have seen so far going from CoreMidiSource have had absolute timestamps, I don’t think that will necessarily be the case for packets coming over the network. Or rather, while they may still be absolute, the origins are going to be different, because the boot time of the source device (if it is an Apple device) was different, and other vendors may use different time bases.

Or I suppose it is even possible that CoreMIDI does the hard work of synchronizing network time bases for us, and normalizes timestamps from all devices, even RTP MIDI devices over the network, to the same Mach Absolute Time Units since the local kernel booted. So much unknown…

But in the absence of certain knowledge of nice features like my speculative prior paragraph, returning -1 from getMicrosecondPosition() in CoreMidiSource is the right thing to do unless we discover a way to query Core MIDI for the underlying time base origin of that port, and I would be surprised if we can. So Java code can use the timestamps to derive relative timings of events, at least.

Going the other direction, we need to convert from microseconds to Mach Absolute Time Units, and we need to translate from Java semantics to Core MIDI semantics as well. The Core MIDI reference says:

timeStamp The time at which the events occurred, if receiving MIDI, or, if sending MIDI, the time at which the events are to be played. Zero means "now." The time stamp applies to the first MIDI byte in the packet.

So to follow along with your approach of using a relative startTime for the device, first we should check if the message coming from Java has a -1 value for the timestamp, which is the Java “don’t care” flag. That should be translated to a 0 for Core MIDI. (And, indeed, all of the messages that I am sending from Afterglow currently have timestamps of -1, which should be translated to zero.)

If we see a different value in the timestamp from Java, we should treat that as a time relative to the startTime we are using to report getMicrosecondPosition(), and then convert that from microseconds to Mach Absolute Time Units.

However, should that be an absolute time? The Core MIDI reference is dreadfully terse. But given that Mach Absolute Time Units are rather explicitly defined as such, I think we need to assume so. It would be nice to hear from someone more expert in the field. I think one of us is going to eventually end up needing this book. I suspect Core MIDI wants us to request specific playback times (when we pass in non-zero timestamps) as monotonically increasing time units since kernel boot, like it uses elsewhere. And if that turns out to be true, then we need to add back startTime if we had subtracted it when calculating getMicrosecondPosition() for Java.

The only things I am fully confident about right now, I guess, are that we should be returning -1 from getMicrosecondPosition() for CoreMidiSource, translating timestamps of -1 to 0 for CoreMidiDestination, and never delaying messages, regardless of the timestamp we are passing along. If you would like to make Java comfortable by having the microsecond time base that getMicrosecondPosition() reports for CoreMidiDestination start at 0 when you open the port, I think that is fine, but then we’ll need to add startTime microseconds back to any timestamps other than -1 before we convert them to Mach Absolute Time Units and send them to Core MIDI.
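
A minimal sketch of the outbound handling being proposed at this point (the class and the startTime field are hypothetical stand-ins, not the actual CoreMIDI4J code; converting the resulting microseconds into Mach Absolute Time Units would happen on the native side):

```java
// Illustrative only: map a Java timestamp to the microseconds-since-boot value that
// the native side would then convert into Mach Absolute Time Units.
class OutboundTimestampSketch {

  /** Hypothetical field: microseconds since boot at the moment the port was opened. */
  private final long startTime;

  OutboundTimestampSketch(long startTime) {
    this.startTime = startTime;
  }

  long toMicrosecondsSinceBoot(long javaTimestamp) {
    if (javaTimestamp == -1) {
      return 0;                        // Java's "don't care" flag becomes Core MIDI's "now"
    }
    return javaTimestamp + startTime;  // undo the port-relative offset before conversion
  }
}
```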

DerekCook commented 8 years ago

I think we agree on incoming messages: -1 for CoreMidiSource.

Outgoing messages need a little experimentation, but I don't think we need to get too complicated if you let Java handle any required delays - it needs no changes on the native side, which is currently hard coded to send messages with a timestamp of 0 (now), and of course a message with a timestamp of -1 goes immediately. The alternative is what I think you are suggesting: Java never delays messages, but does the timestamp shenanigans and leaves CoreMIDI to worry about when to send. Six of one, half a dozen of the other, I think, but if CoreMIDI always sends when the timestamp is 0, I would be more comfortable experimenting with a Java implementation of delaying messages when requested.

PS: I've already got that book in eBook form, but didn't find it as useful as I thought; however it might be helpful in this case, so I'll see if it has any nuggets.

brunchboy commented 8 years ago

OK. I really think we need to stay away from delaying messages. Nothing I have seen anywhere suggests that the transport layer is responsible for doing this, and it will play utter havoc with people who are trying to use future-dated timestamps to synchronize performances across the network, which I know people in my live-coding music performance community actually do.

As far as the values to use in the timestamp, yes, I agree it seems six of one and half a dozen of the other. Experimentation, and perhaps talking with people who actually use them. We are good for now with sending 0, and perhaps we can wait until we have an actual user who wants to do something else to tell us what behavior they expect and need.

DerekCook commented 8 years ago

With what I am talking about, a message with a -1 timestamp will be sent instantly (as at present) by CoreMidiDestination, so what is the issue with experimenting with the implementation also delaying messages if hinted at by the application using a different timestamp? To be clear, if there is no hint, i.e. -1, then the message will be sent with no delay.

Just because we don't need it, doesn't mean to say that other apps will not be expecting it, so I am trying to anticipate that.

Unless I am missing something, CoreMIDI4J is not going to work with people in your community (assuming they need it) if they do use future dated timestamps - CoreMIDI4J currently sends everything instantly regardless of what the timestamp value is, which I think is not a good thing.

The alternative is that I will shortly lose the will to live on this debate and revert to CoreMidiDestination not having any notion of time.

brunchboy commented 8 years ago

Let’s try one more round. I do think you may be missing something I am trying to express: the timestamp value is a piece of information that is passed along with the MIDI event. Its purpose is to let you achieve things you simply cannot do by delaying and then sending the MIDI event. That purpose is to tell the final receiver of the event that you want it to be implemented at some specific time in the future, in a way that is independent of any variable delays which might occur in transporting the event, as a way of smoothing out those delays.

The only place where you need to actually implement a delay based on the timestamp is when you are transitioning from a realm that supports timestamps to a realm that does not: the actual software synthesizer turning a MIDI note into audio, or a hardware interface converting the event to MIDI 1.0 DIN cable signaling, which has no timestamps. Since CoreMIDI4J exists in between two realms which both support timestamps, all we need to do is faithfully pass any we receive as inputs along to the outputs, in the format that either side expects them to have; microsecond values in the case of Java, and Mach Absolute Time Units in the case of Core MIDI. That way, we enable faithful communication through our leg of the relay. Java code can set a timestamp, we translate it and pass it to CoreMIDI, which can then translate it and pass it as an RTP MIDI packet which follows the RTP MIDI RFC and gets sent over a network, then to some receiving MIDI implementation (perhaps Core MIDI again on OS X or iOS, perhaps one of the others I mentioned), and finally to some underlying software, which has a chance to look at the timestamp and schedule the event to happen at the correct predictable time after all the unpredictable network communication steps have occurred.

If we do that, we do no damage to the end-to-end communication that people might be trying to accomplish through their timestamps. On the other hand, if we start trying to implement the delay ourselves, we destroy the ability of senders to have the packet received at its destination well enough in advance of when it is supposed to take effect to smooth out network jitter, because we have used up all of the slack before the packet was even put on the network.

The part that we are stuck on, as I see it, is that we are unsure of what Mach Absolute Time Units to use when posting packets from Java to Core MIDI. My best guess is that we want to translate the delta in microseconds between the Java timestamp (assuming it is not -1) and what getMicrosecondPosition() would report for the CoreMidiDestination into a delta in Mach Absolute Time Units from the current system time, which seems to be the Core MIDI time base. But that is just a guess, and it would be nice to hear from someone who has actually implemented timestamp-based Core MIDI software. Perhaps the book can turn up a nugget of use? Or perhaps we can find a mailing list with experts on it, or open a support request with Apple, or we might even get lucky with a question on StackOverflow.

But it seems to me that even that guess would be better than simply throwing away the timestamp information and always sending a 0, or implementing a delay ourselves and then sending 0, for the reasons I outlined above. Until we know what we are doing, though, perhaps returning -1 from getMicrosecondPosition() for CoreMidiDestination as well is best.

This article seems consistent with the view that Core MIDI timestamps are always expressed in host time, and highlights the approach of sending messages over the network early with future timestamps.

And again, the Core MIDI API reference defines MIDITimeStamp values as being the same as host clock time:

A host clock time representing the time of an event, as returned by mach_absolute_time() or UpTime().

Indeed, I even found someone commenting on another Github project where they found code that was using Grand Central Dispatch to delay and schedule a MIDI message to be sent in the future, and they said this:

I believe the intention of this method is to send a midi note in the future right? If so, then instead of using GCD to do it I would use the CoreMIDI feature that let you send a note to be played later in the future using a timestamp, one big advantage of using this CoreMIDI feature is that you let CoreMidi deal with the latency and all the crazy stuff you dont want to deal with so by preparing midi messages to be triggered later using a timestamp, in theory you also get rid of the latency ...

This poster does not even mention the deeper advantage I was talking about, of situations when Core MIDI is talking to other systems that also support timestamps, and so the timestamp can be passed on, and acted upon by the final recipient. But even apart from that he is probably right that Apple’s multimedia engineers are more experienced than we are at getting realtime scheduling down.

So it appears we are circling in on stable definitions of the timestamps in Java and in Core MIDI. In Java, they are always microseconds, relative to the device time reported by getMicrosecondPosition(), with -1 meaning now. In Core MIDI, they are always Mach Absolute Time Units, relative to when the kernel booted, with 0 meaning now. I think our job is fairly clear: just translate back and forth between those formats. And I do think it is entirely up to us whether we want getMicrosecondPosition() to be zero when the system booted, or when the device was opened. Either way is fine; the latter is more compatible with what Java expects, so we might as well implement it that way. But if we do that, we should probably do it for both inputs and outputs, and have it return microseconds since the device was opened in each case. We would then translate the MIDI packet timestamps (other than ones that mean now) going to Java by converting Mach Absolute Time Units to microseconds and then subtracting startTime, and in the other direction, adding startTime to the microseconds from Java, then converting to Mach Absolute Time Units.

brunchboy commented 8 years ago

I looked again at the Java Tutorial about MIDI messages, and it is pretty clear that you are right: we should be using a time base within Java that starts from zero when the device is opened. I think the easiest and cleanest course of action would be to do the converting between microseconds and Mach Absolute Time Units on the native side, since that is where the library functions exist to facilitate that, and since we are sending unsigned values back and forth with Java. So we send these unsigned microsecond values, which are relative to system boot time, back and forth between our native side and our Java side. In these values, 0 is used to represent “now”.

On the Java side, between CoreMIDI4J and the user of the SPI, we perform an additional translation, which is to subtract the device’s startTime value when we are reporting timestamps, and to add it when we accept them. That way, timestamps and getMicrosecondPosition() are exactly what Java expects, and consistent with each other, so you can perform the sorts of operations mentioned in that tutorial:

This method returns -1 if the device ignores time stamps. Otherwise, it returns the device's current notion of time, which you as the sender can use as an offset when determining the time stamps for messages you subsequently send. For example, if you want to send a message with a time stamp for five milliseconds in the future, you can get the device's current position in microseconds, add 5000 microseconds, and use that as the time stamp. Keep in mind that the MidiDevice's notion of time always places time zero at the time the device was opened.

For that to work, we need to have a consistent time base, which was exactly why I opened this issue. And then when the client does that, and gives us the timestamp for a MIDI message 5000 microseconds in the future, we on the Java side add back our startTime value before passing it to the native side, so it becomes microseconds since the system booted. On the native side we convert that into Mach Absolute Time Units, and now we have a timestamp in the format CoreMIDI expects, which represents the time the user intended: five milliseconds into the future. I was going to try implementing this quickly, but I just noticed that Java_uk_co_xfactorylibrarians_coremidi4j_CoreMidiOutputPort_sendMidiMessage is missing the timestamp parameter, and mucking with the JNI headers is more than I feel up to tonight.

But I think this is the right answer, is straightforward to implement, satisfies the contracts of all sides, and allows the timestamp to be used to communicate programmer intent correctly across the Java to CoreMIDI boundary.

Analogously, suppose we get a timestamp on an incoming packet from CoreMIDI representing a time 4000 microseconds ago. We convert it to microseconds on the native side, pass it to the Java side, and there subtract our startTime. The client Java code thus receives a timestamp which is relative to when the device was opened, as spelled out in the Java MIDI spec. If the code wants to know what time that timestamp represents, it can call getMicrosecondPosition() on the port, and see a value that is 4000 higher than the timestamp, telling it that the event occurred 4000 microseconds ago.
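
To make the receive-side arithmetic concrete, a small sketch of client code that relates an incoming timestamp to the device’s current position (illustrative only; the 4000 microsecond figure is just the example above, and the class name is hypothetical):

```java
import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiMessage;
import javax.sound.midi.Receiver;

/** Illustrative only: report how long ago each incoming event occurred. */
class EventAgeReceiver implements Receiver {

  private final MidiDevice device;  // the input device whose transmitter feeds this receiver

  EventAgeReceiver(MidiDevice device) {
    this.device = device;
  }

  @Override
  public void send(MidiMessage message, long timeStamp) {
    long now = device.getMicrosecondPosition();
    if (timeStamp != -1 && now != -1) {
      // With a consistent time base, an event received 4000 microseconds ago reports 4000 here.
      System.out.println("Event occurred " + (now - timeStamp) + " microseconds ago");
    }
  }

  @Override
  public void close() {
    // Nothing to release in this sketch.
  }
}
```

It would be registered in the usual way, with transmitter.setReceiver(new EventAgeReceiver(device)).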

DerekCook commented 8 years ago

I think you have been missing some of the things I have been saying as well! The truth of what needs to be done is probably somewhere in the middle. The extra piece of information you've given above, which makes me tend more towards agreeing with you about where timestamps are processed (and makes extensive re-debate of the missed points nugatory), is that you are expecting time stamps to "go outside of the box" to other devices before they hit a MIDI port. So, yes, I agree with you if timestamps are sent across the network using RTP; some of my points (quite valid if you stay inside the box) become invalid in that context.

So, reading (very quickly) what you have written above, we are probably converging on what needs to be done, and you've probably captured it in the last post.

One of those things where a quick start at a feature mushrooms into something of more complexity, but the right thing to do in the long term.

brunchboy commented 8 years ago

Yes, I have no doubt that I missed some of what you said, in struggling to better convey bits of what I was trying to explain. That is a definite pitfall of communicating in writing with a time delay! Your points are completely valid if we have no “outside the box” mechanism to convey the timestamps. And what I was abjectly failing to express is that there is indeed such a mechanism, from Java to Core MIDI to RTP MIDI, which carries this extra information. And we were both confusing the notion of “when you send the message” with “when the message should take effect”, and the latter is what the timestamp conveys, independent of the former.

Also, I have clearly—looking back on this thread—learned and forgotten a variety of details more than once. Another pitfall of doing a deep dive on another project while trying to also talk about this one. Sorry!

But I agree that we seem to be converging. I think we now understand the semantics of timestamps in both the Java and Core MIDI worlds well, and can reliably translate between them. And that this is all we need to do, as we are simply a bridge between those two worlds.

In Java

Timestamps are signed microseconds which start at zero when the device is opened. You can determine what the current timestamp would be by calling getMicrosecondPosition() on the device. Individual devices have their own time bases. To convey you want a message to take effect “right now” you use a timestamp of -1.

In CoreMIDI

Timestamps are unsigned Mach Absolute Time Units that start at zero when the kernel boots. You can determine what the current timestamp would be by calling mach_absolute_time(). All devices share the same time base. To convey you want a message to take effect “right now” you use a timestamp of 0.

Convert CoreMIDI Timestamps to Java

In Native Code: convert the packet's Mach Absolute Time Units into microseconds since boot.

In Java Code: subtract the port's startTime, so the value is relative to when the device was opened.

Convert Java Timestamps to CoreMIDI

In Java Code: translate -1 to 0 (meaning "now"); otherwise add the port's startTime back, giving microseconds since boot.

In Native Code: convert those microseconds since boot into Mach Absolute Time Units, with 0 still meaning "now".
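
Pulling those four steps together, a minimal Java-flavoured sketch (illustrative only: the class and field names are hypothetical, and the conversions to and from Mach Absolute Time Units would actually live on the native side of CoreMIDI4J):

```java
/** Illustrative sketch of the two translation directions; not the actual implementation. */
class TimestampTranslationSketch {

  /** Hypothetical field: microseconds since boot when this port was opened. */
  private final long startTime;

  TimestampTranslationSketch(long startTime) {
    this.startTime = startTime;
  }

  // Core MIDI -> Java: the native side first converts the packet's Mach Absolute Time
  // Units into microseconds since boot; the Java side then subtracts startTime so the
  // value is relative to when the device was opened, as Java expects.
  long coreMidiToJava(long microsecondsSinceBoot) {
    return microsecondsSinceBoot - startTime;
  }

  // Java -> Core MIDI: -1 ("don't care") becomes 0 ("now"); any other value has startTime
  // added back, and the native side then converts the resulting microseconds since boot
  // into Mach Absolute Time Units before handing the packet to Core MIDI.
  long javaToCoreMidi(long javaTimestamp) {
    return (javaTimestamp == -1) ? 0 : javaTimestamp + startTime;
  }
}
```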

I think that gives us perfect fidelity in translation of the timestamps in both directions. And we can implement getMicrosecondPosition() the same way for inputs and outputs: Just get the current system time in microseconds and subtract the device’s startTime. That will allow the Java client code to correctly interpret the timestamp relative to current time, as specified by the Java MIDI API.
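
A sketch of that, assuming the native side can supply the current host time already converted to microseconds (the abstract method is a hypothetical stand-in for that native call, not an existing CoreMIDI4J function):

```java
/** Illustrative only: getMicrosecondPosition() as "time now minus time the port was opened". */
abstract class MicrosecondPositionSketch {

  /** Hypothetical field: microseconds since boot when the port was opened. */
  protected final long startTime;

  MicrosecondPositionSketch(long startTime) {
    this.startTime = startTime;
  }

  /** Stand-in for a native call that converts mach_absolute_time() to microseconds. */
  protected abstract long currentHostTimeInMicroseconds();

  /** Microseconds elapsed since the port was opened, for both inputs and outputs. */
  public long getMicrosecondPosition() {
    return currentHostTimeInMicroseconds() - startTime;
  }
}
```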

DerekCook commented 8 years ago

Yes, I think we got there, and can kiss and make up now. ;-) The debate has gotten to where it needs to get to. You are 100% correct about the problems of timezones. I do work for Australia, and there is no substitute for being in the same timezone and occasionally in the same location!

Leave it with me to mull on the "requirement" and make sure I fully understand it. I will also do some background reading in that Core Audio book to see if it gives any nuggets of interest - either additional info or at the least verification or otherwise of the above.

I've assigned the issue to myself, and after getting another recording session with my singer done tonight (starting shortly), I will start a new branch to investigate this.

In the meantime, as much stress testing of the 0.3 release as possible, between yourself and users like Vlad, would be good.

brunchboy commented 8 years ago

Whoops! After getting it so clearly spelled out, I could not resist going ahead with an implementation, which is nearly finished. I will commit that to a separate branch, and you can either use it or do it your own way as you see fit.

DerekCook commented 8 years ago

OK. I will leave it until you have finished on your branch and check that over. No desire to make what you've done nugatory or reinvent it for the sake of it.....

brunchboy commented 8 years ago

All right! I have it working. Now when I look at timestamps in Java, they relate properly to the values returned by getMicrosecondPosition() for the device on which they were received. And when I send a note-off message with a timestamp a million microseconds greater than the value returned by getMicrosecondPosition(), followed by a note-on message with a timestamp of -1, what Core MIDI actually does is play the note on immediately (and first, even though I sent it second), followed by the note-off, a second later. I would say we have cracked it! I will check over the code and push the branch.

DerekCook commented 8 years ago

OK, sounds promising

brunchboy commented 8 years ago

Derek, I have some more interesting news about this. I just did some comparison testing, and (perhaps unsurprisingly), the Sun/Oracle Mac MIDI implementation completely ignores timestamps. When I try the same test I mentioned above, both messages get sent immediately, and in the wrong order.

But even though the Sun/Oracle MidiDevice implementations do not support timestamps, they report positive values for getMicrosecondPosition(), which violates the Sun API specification; they are supposed to return -1.

So this is another area in which we now offer correct functionality where the default implementation does not.

DerekCook commented 8 years ago

That's interesting! I'm curious what happens on PC now, which I might try sometime.

Sounds like we are cutting edge now....

I'll check out the branch when I get back from work tonight.

brunchboy commented 8 years ago

Oh, that’s a good point! I don’t have any (non-virtual) Windows machines on which to try it, so I would be curious as well. I would not be surprised to find out that this is a Mac-only concept, though; Apple has always been ahead of the curve when it comes to thoroughness in multimedia and networking features. Indeed, I just skimmed the MSDN documentation for Windows MIDI APIs, and they seem to lack all notion of timestamps, instead slavishly recreating the limitations of 1980s point-to-point serial MIDI.

One thing I do want to try soon is to set up an RTP MIDI stream between two Macs, and send some timestamped MIDI events between JVMs with CoreMIDI4J on them, and see when they get delivered and processed.

DerekCook commented 8 years ago

Only had time for a quick check tonight (and will be away for a few days), so I will leave this branch open in case you wish to add to it. A quick test with one of my synths shows no functional issues, but I'd like to test with all of them (as we have made changes in JNI), and experiment with sending messages with timestamps other than -1 just to verify on a different setup.

The only issue I have found is that I am guessing you built the jar file with JRE 8, so I was getting incompatible class version exceptions when I tried to run my apps launched in JRE 7 (my current baseline, although I do test in JRE 8 as well). If I switched to JRE 8 it was fine, and if I recompiled for JRE 7 then it was fine as well.

I guess that raises the question of what should be the minimum JRE. Given JRE 7 is now past end of life (no more security fixes), maybe JRE 8 should be the minimum. Now that we have a working MIDI SPI, the reason I was held back is gone. MMJ held me back to JRE 6, and I couldn't get OSXMIDI4J to work with my apps on JRE 8, which held me back to JRE 7. Whilst developing CoreMIDI4J I had everything set up for JRE 7. So there is no reason not to move to JRE 8 only, but we probably need to test a little more carefully.

What do you think?

That's probably as much time as I have for this before I get back on the weekend (but I will be checking issues/comments). But good work. I think we will get to V0.4 on the weekend!

brunchboy commented 8 years ago

Oh, whoops, sorry! I agree, I think it makes sense to be kind to the broadest set of potential users, since we have no need of Java 8 specific classes or features, and continue building the releases for JRE 7. I will stop pushing jars and zips and leave the final build stage to you since I don’t maintain any Java 7 environments.

In any case, thanks for the once-over, did you have a chance to review the code itself?

Given what we have learned about the rarity of timestamp support in general, there is no rush to get a new release incorporating it out. It will just be something we can pat ourselves on the back for once we do. :wink: And I can do the multi-machine testing I am curious about with the builds I have made on this branch.

brunchboy commented 8 years ago

Hold up, I can fix this easily, and retain flexibility in who makes a release in a pinch. I shall just update the Ant build file to specify that the source and target Java releases are 1.7. Let me make that change and push another build, for when you have a chance to confirm that it works properly with your Java 7 environments.

brunchboy commented 8 years ago

All right, it was a bit more of a pain than I hoped, as I need to have a copy of Java 7’s rt.jar in order to faithfully compile code that targets it. But I downloaded an archival release of that version from Oracle, and updated the build file to require that be present in order to compile the source. This won’t affect you, since you are doing the compile from Eclipse, using a project configured to use the correct JDK. But if someone else wants to compile using Project Builder.xml they will need to do the same. (We don’t want to redistribute rt.jar as part of this project; Oracle is very picky about partial distributions of Java.)

Please let me know if my revised jar and zip files work with your Java 7 environments; they should.

brunchboy commented 8 years ago

I have repeatedly forgotten to add this thought, but finally remembered: I’m not sure you need to test with all of your synths, because the nature of the JNI change—the adding of an outbound timestamp parameter—was such that sending MIDI either works every time, or the JVM crashes every time, depending on whether the Java and native libraries are properly matched. Now, if you would like to take the time to move them all around and do the testing, lovely, I won’t try to stop you. But I don’t think it is nearly as urgently needed as when I rewrote the core MIDI data processing loop.

DerekCook commented 8 years ago

Thanks, James. Think this is done now. Works fine with my AN1x. I'll test with other synths as I move my librarians towards release, which is the next step I think. There is now a new V0.4 release tag.