igorski / MWEngine

Audio engine and DSP library for Android, written in C++ to provide low-latency performance within a musical context, while exposing a Java/Kotlin API. Supports both OpenSL and AAudio.
MIT License

java interface from MWEngine C++ #141

Closed scar20 closed 2 years ago

scar20 commented 2 years ago

My project uses this waveform view to display and edit the range of the current sample: https://github.com/newventuresoftware/WaveformControl It has an interface to update the cursor while playing and to signal when the end is reached.

public interface PlaybackListener {
    void onProgress(int progress);
    void onCompletion();
}

This in turn is hooked to an (android.media) AudioTrack playback position listener in the playback thread.

        audioTrack.setPlaybackPositionUpdateListener(new AudioTrack.OnPlaybackPositionUpdateListener() {
            @Override
            public void onPeriodicNotification(AudioTrack track) {
                if (mListener != null && track.getPlayState() == AudioTrack.PLAYSTATE_PLAYING) {
                    mListener.onProgress((track.getPlaybackHeadPosition() * 1000) / SAMPLE_RATE);
                }
            }

            @Override
            public void onMarkerReached(AudioTrack track) {
                Log.v(LOG_TAG, "Audio file end reached");
                track.release();
                if (mListener != null) {
                    mListener.onCompletion();
                }
            }
        });

Then I thought I'd simply hook the interface of the view to..... mmmm.... well, all I have to do is simply create a .... in SampleEvent that will generate a Java interface auto-magically via SWIG. Simple? Well... how is this done, and is it doable? I've looked at solutions on Stack Overflow and realized it's not as simple as I thought... abstract classes, header functions, callbacks... Which would work best?

What I have in mind for updating the cursor is a callback that reports the pointer position at each start/end of a read buffer (since we don't want to update at sample rate); that should give enough updates without taxing the process too much (???). But I still wonder how that would be translated to Java.

I realize now the issue is more complex than I first thought. Or maybe not; trying to be optimistic... For now I can do without it, but as soon as I finish packaging the first draft of the app I'll need to get into this. So any advice on where to look (docs, how-tos) or any idea you already have will be welcome. It looks like a hard question, I know...

igorski commented 2 years ago

It may sound like a "hack" but this can be done rather cheaply without needing to hook into an engine callback.

Basically, if you know the sequencer is running and you have a SampleEvent, you can use the SampleEvent's getReadPointer (for a sequenced event) or getPlaybackPosition methods to determine the current playback offset for the event, by relating that value to the sample's eventLength / bufferRangeLength (when using custom ranges).

You can poll this at the same rate you wish to update the screen (60 fps) whenever you have your waveform display open for that specific event (or events) ?
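As an illustration, the suggested polling approach can be sketched in plain Java. `PlaybackSource` and `ProgressPoller` are hypothetical names standing in for the values MWEngine's `SampleEvent` exposes (its playback position and event/range length); they are not part of the library:

```java
// Hypothetical stand-in for the values the SWIG-generated SampleEvent
// bindings expose; in a real app these would delegate to MWEngine.
interface PlaybackSource {
    int getPlaybackPosition(); // current read offset, in samples
    int getEventLength();      // total length of the event or custom range, in samples
    boolean isPlaying();
}

class ProgressPoller {
    private final PlaybackSource source;

    ProgressPoller(PlaybackSource source) {
        this.source = source;
    }

    /** Playback progress as a fraction between 0.0 and 1.0. */
    double progress() {
        int length = source.getEventLength();
        if (length <= 0) return 0.0;
        double p = (double) source.getPlaybackPosition() / length;
        return Math.min(1.0, Math.max(0.0, p));
    }

    /** True once playback has stopped or the read offset has reached the end. */
    boolean isCompleted() {
        return !source.isPlaying()
            || source.getPlaybackPosition() >= source.getEventLength();
    }
}
```

On Android the `progress()` call would then happen at display rate (from `onDraw()` or a frame callback), rather than once per audio buffer.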

scar20 commented 2 years ago

Kind of the reverse way around... using onDraw() to call the update. I can see that, even though it looks a bit strange. If you say it's cheap, I'll trust you :). I thought performance would have been better from C++, but maybe it's not worth it.

Anyhow, I also need a signal for when a sample has reached the end (not looping) and stopped. Oh wait, still with onDraw(): checking playbackPos against bufferRangeEnd while not looping, ahhh... I'll try to make it lightweight. Going to bed now... Let's do that tomorrow.

igorski commented 2 years ago

Well, it won't necessarily melt your CPU, but if every event is broadcasting its position updates just for the benefit of being able to visualize playback position, I would suggest doing it the other way around: give the visual renderer the responsibility of determining what the progress is, and only when it is on screen.

It won't be sample accurate as the screen refresh is not tied to the current/last buffer rendering cycle but the difference should be negligible.

You can, however, send messages from your native code directly to Java using the Notifier/Observer pattern. So you could extend SampleEvent to broadcast its position whenever the mixBuffer method ends. Be aware though that this will fire a lot more often than your screen will update (as the framerate is lower than the number of audio buffers processed per second), hence my suggestion of using the inverse approach.
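For reference, the Notifier/Observer pattern itself can be sketched in plain Java as below. MWEngine's actual implementation lives in native code (notifier.h with broadcast()); the class names and the notification id here are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// An observer receives broadcast notification ids (cf. MWEngine's native Notifier).
interface Observer {
    void handleNotification(int notificationId);
}

class Notifier {
    // Hypothetical id; in MWEngine notification types are enumerated natively.
    static final int SAMPLE_PLAYBACK_COMPLETE = 1;

    private final List<Observer> observers = new ArrayList<>();

    void registerObserver(Observer observer)   { observers.add(observer); }
    void unregisterObserver(Observer observer) { observers.remove(observer); }

    // In MWEngine this broadcast would be triggered from native code,
    // e.g. at the end of SampleEvent::mixBuffer().
    void broadcast(int notificationId) {
        for (Observer observer : observers) {
            observer.handleNotification(notificationId);
        }
    }
}
```

Note the caveat above: if broadcast() fires once per mixed buffer, the observer is called far more often than the screen refreshes, so the Java side would still need to throttle any UI updates it drives from these notifications.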

scar20 commented 2 years ago

Yeah, given my device's preferred buffer size (96), it will fire 500 times/sec, about what a current extreme graphics card like an Nvidia RTX can produce. Almost 10 times the standard 60 Hz refresh rate; no need for that. I'll look for a system callback (WindowManager perhaps) to get a pre-onDraw() call to hook the interface to.
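The arithmetic behind that estimate can be made explicit. A 500/sec rate from a 96-frame buffer implies a 48 kHz sample rate (an assumption here; at 44.1 kHz it would be roughly 459/sec):

```java
final class CallbackRate {
    /** Number of per-buffer callbacks fired per second of audio. */
    static int callbacksPerSecond(int sampleRate, int bufferSizeInFrames) {
        return sampleRate / bufferSizeInFrames;
    }

    /** How many of those callbacks fall within one 60 Hz display frame. */
    static double callbacksPerDisplayFrame(int sampleRate, int bufferSizeInFrames) {
        return callbacksPerSecond(sampleRate, bufferSizeInFrames) / 60.0;
    }
}
```

`callbacksPerSecond(48000, 96)` gives 500, i.e. roughly 8.3 callbacks per 60 Hz display frame, which is why polling at display rate is the better fit for driving a waveform cursor.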

On the other hand, a signal to indicate whether a sample is actually playing or not could be useful, and it's no big deal. Thanks for pointing me to the observer pattern, which I had overlooked. Of course: include notifier.h and call broadcast(myNotificationId) when appropriate (sample play/stop). I'll look at how you set up the RECORDING_SNIPPET_READY notification to get a good model.

Thanks, I feel enlightened.

scar20 commented 2 years ago

Well, I could get a signal from SampleEvent through the notifier, but instead of only one at the end of the sample read I get way more... It seems I didn't fire the signal from the right place, or (more probably) my understanding of how the sample is actually read was wrong. I will take a better look later on.

So instead of messing around with the class, I've taken the "from the Java side" route for the onCompletion signal as well. Now it's OK, and all my button states (forward & back) become deactivated at sample (or range) end, as expected. Hurray.

BTW, I found this gem on Stack Overflow for hooking into the Choreographer to sync a callback: a small snippet in Kotlin (which I'm not fluent in) that has proven perfect for my use case, nice: https://stackoverflow.com/questions/44036646/how-to-sync-to-android-frame-rate

Lastly, I now need a short buffer to feed the waveform view. For this I'll reopen the issue that I had previously closed, so as not to intermix subjects.

igorski commented 2 years ago

Unless I'm mistaken, there's no action left here right @scar20 ? :)

I also regret using the word "hack" in my earlier message. The worlds of audio rendering and display rendering are two very different things (due to the audio rendering frequency being higher than the display's). By having each of them manage its own update interval you are doing the right thing. So technically the display might be "behind" the actual audio "event" you are representing visually, but that will always be the case. The human eye will not perceive it as such, however.

scar20 commented 2 years ago

Yeah, I forgot this thread. Problem solved, I'll close it. BTW, in my version of sampleevent.cpp I added a signal for when a sample stops playing. I find it useful for changing the state of the UI at the end of a sample. I also had to add an (optional) id field, to know which sample has stopped. It could be a nice addition to this class, but please refrain from making changes to this class yet; I'll send you a pull request once I'm able to get back to it and clean up some of the mess I've made. Right now I'm in the middle of a freak bug, and also a transplant for an aging computer...