willhaslett opened this issue 2 years ago
No benchmarking tests were done, sorry.
From what I see, you load the sound data just before playing it. This may be a reason for the delay (`load` pushes the payload to the native code through the platform channel). Can you try the following approach?
```dart
// Load once, as early as possible, and keep the returned id in state.
int soundId = await pool.load(soundData);
// ...
// Then, in the gesture callback, just trigger playback:
onPointerDown: (_) => pool.play(soundId),
```
This way you separate the loading process from triggering the playback.
Thank you for the reply. I do get the same latency with that method. Of course, there has to be a penalty for the extra call, but it's negligible for what I'm after. I hope to make a real-time musical instrument, and latency of < 50 ms is required for that.
That method is also worse for my purposes in that if I tap the same UI element repeatedly, I only get sound about once per second, even though the callback is hit every time. By calling `pool.load()` instead, I do get sound for every tap, regardless of how fast I'm tapping.
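For what it's worth, one hypothetical workaround for the once-per-second behavior (untested, and assuming only the `pool.load`/`pool.play` API used above) would be to preload several copies of the same sample and rotate through the ids on each tap, so rapid retriggers don't contend for one stream:

```dart
// Untested sketch: preload N copies of the same sample and round-robin
// through them. `pool` and `soundData` are the same objects as above.
late final List<int> soundIds;

Future<void> preload() async {
  soundIds = [for (var i = 0; i < 4; i++) await pool.load(soundData)];
}

var _next = 0;
void onTap() {
  pool.play(soundIds[_next]);
  _next = (_next + 1) % soundIds.length;
}
```

Whether this helps depends on why the single preloaded id only sounds once per second, which the thread doesn't establish.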
I did notice the second comment on this SO answer, which seems to say that a problem has arisen with iOS latency? I'm going to test on Android today and will add a comment with the results.
I get 155 ms on a Pixel 3a running Android 12 with a release apk. I ran the latency test app from Superpowered and the Pixel came in at 20 ms.
If your library didn't exist, I would start here for Android and here for iOS, with the plan being to wrap each with a thin, common API that facilitates communication with the Dart layer.
Since your library does exist (it appears to be the best thing going for low latency audio in Flutter [0]), I wonder if it might be smarter to fork your repo and see what I can do, rather than going closer to the metal as I describe above.
Since you have the experience here, do you have any advice?
[0] The only other candidate seems to be audioplayers. I ran the example app and tested low latency mode, having replaced the example app's MP3 source (it begins with silence) with a drum hit WAV. It was slow enough that it wasn't worth measuring.
TL;DR: I'd suggest forking the repository if your main concern is iOS latency; I am aware there is a "faster" API available. If you are going to try using native libraries on Android (OpenSL or AAudio), then it would be better to make your own plugin, since you'd have control over the API you provide.
I must admit that I haven't really tried to push for the lowest-latency implementation anywhere. My initial goal was to make a bridge to the `SoundPool` Android API, which is supposed to be used when you need low-latency playback. The iOS/web and macOS implementations were done to provide feature parity, but I don't have enough experience with those platforms to make really performant implementations for them. I have in mind rewriting the iOS part to `AVAudioEngine`, but I haven't had enough time/motivation to do that. Another option may be to do the implementation using FFI, which should yield better results, but it would require learning how that can be done for each platform. As an experiment I've made an implementation in Rust for Windows and Linux and it worked (although I haven't measured latency); it may be worth trying for mobile devices too.
Many thanks for the advice! Also, thank you for the totally pro commenting in this code, which will make it so much easier to work with! Cheers.
If you need any help do not hesitate to write me.
I'm also developing a Flutter app that plays sounds. The lowest latency I've found for Android was flutter_ogg_piano (an average of 20ms on Samsung S7) after trying audioplayers and soundpool and finding their latencies too high. For iOS, soundpool was the best I could find but still has an average of 70ms (on iPhone 6), so I'm always interested in lower latency alternatives.
@JimTompkins @willhaslett Can you describe how you test the latency in your apps? I'd like to perform similar tests to have my own baseline to try to improve.
@ukasz123 I set things up so that I'm recording the sound in the room (a built-in laptop microphone will do): the mic picks up both the sound of my finger tapping on the phone and the sound coming out of the phone from the app. To pick up the tapping sound, I tap hard. Then it's just a matter of visualizing the waveform of your recording and measuring the time between the two onsets. You can look at the peaks of the two noises, but for maximum precision, zoom way in on the waveform and measure from the instant each tap/sound begins (the first disturbance you can see in the waveform). For the sound coming from the app, use a quick-onset percussive sound, like a snare drum. I use Logic Pro to measure the onset-to-onset distance, but this could be done in any number of apps; I imagine Audacity (free) would support this.
Hi Łukasz,
If you have an app that uses a button press event to play a sound, you can:
1) Make an audio recording (e.g. MP3, WAV) using a sound recorder app while your app is running, and click on the button. Repeat several times so you can calculate an average latency. You may need to do it "loudly", e.g. with your fingernail, to make sure it comes through in the recording.
2) Open the audio recording using e.g. Audacity. Effect --> Normalize the file to make the sounds easier to see. Zoom in on a button click event and you should see the sound of the button press followed by the sound being played by soundpool. Select the region from the button press to the sound being played. Audacity will show the start and stop time of this region at the bottom of the screen.
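The measurement in step 2 could be roughly automated. A hedged pure-Dart sketch (every name and threshold here is mine, not from this thread): given decoded mono PCM samples and a sample rate, find the tap transient and the app's sound onset and return the gap in milliseconds.

```dart
// Sketch only: assumes samples are normalized to [-1, 1] and that the
// recording contains exactly one tap followed by one playback onset.
// `minGapSamples` keeps the decaying tail of the tap itself from being
// mistaken for the second onset.
double? onsetGapMs(List<double> samples, int sampleRate,
    {double threshold = 0.2, int minGapSamples = 2000}) {
  int? firstOnset;
  for (var i = 0; i < samples.length; i++) {
    if (samples[i].abs() < threshold) continue;
    if (firstOnset == null) {
      firstOnset = i; // tap transient
    } else if (i - firstOnset >= minGapSamples) {
      return (i - firstOnset) * 1000 / sampleRate; // gap to app sound
    }
  }
  return null; // fewer than two onsets found
}
```

Real recordings are noisier than this, so the manual zoom-in described above remains the more reliable method.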
I really appreciate your work on soundpool!
Jim
Btw, if the aim is to build a musical instrument (app), a good rough test is playing something percussive (like a beat) along with a song that's playing. Without something playing, you can play the thing and even if you can tell there's a little lag, it may seem okay. But when you try to play live along with something else that's playing, that's when you'll notice if it's a no-go. It'll be like you showed up to the gig on heavy sedatives and you're always trying to find the beat. Things fall apart with a latency roughly somewhere between 20 and 50 ms.
I prepared a special example app for Android benchmarking (see here).
I've run the app in release mode on a Samsung Galaxy A12 phone and played a sound several times.
From the logged timestamps I read that the implementation of the soundpool plugin seems to be pretty fast: less than 5 ms from the start of the tap gesture callback to the invocation of the native Android `SoundPool` instance. Then the call to `SoundPool.play` took between 30 and 40 ms. I believe there is not much I can really do to improve performance in the existing implementation.
Then I checked a recording using Audacity and noticed that latency seemed to be around 200 ms on this device. Then I realized that I was using `InkWell` to run the `_triggerPlaying` method. I switched to `GestureDetector.onTapDown` instead and the latency dropped to around 120 ms. Maybe you can use a similar optimization in your apps.
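For reference, the switch described above might look roughly like this (a sketch; `_triggerPlaying` is the method named in the comment, while the child widget is an assumption):

```dart
// InkWell's onTap only fires after the pointer lifts (and after the
// gesture arena rules out competing gestures such as double-tap),
// which adds input latency. GestureDetector.onTapDown is delivered
// much earlier, on pointer down (possibly after a short
// disambiguation deadline when other gestures compete).
GestureDetector(
  onTapDown: (TapDownDetails _) => _triggerPlaying(),
  child: const Padding(
    padding: EdgeInsets.all(24),
    child: Icon(Icons.music_note),
  ),
)
```

A raw `Listener.onPointerDown` (as used in the original report) should fire even earlier, since it bypasses gesture recognition entirely, at the cost of losing tap semantics.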
Anyway, I don't really know where the majority of the latency comes from. I'd appreciate any suggestions on what else I can check.
For Android, based on all the information here so far, I think that @JimTompkins' results with flutter_ogg_piano may point to the way forward. flutter_ogg_piano uses Oboe, which checks out in terms of its low latency.
Yes, I agree. I think flutter_ogg_piano has the lowest latency on Android because it uses a library closest to the OS. Is there an equivalent to Oboe for iOS?
I'm starting to look into the BASS library from un4seen Developments, but I have no experience with platform channels in Flutter.
Any further developments on your side?
For my part, I've had to shelve this due to taking a gig at a high-growth shop, a job that was designed for a 23-year-old who doesn't sleep and has no responsibilities.
That said, I think there's huge opportunity here, open-source and commercial, to lead with real-time musical instruments that have "low enough" latency. It was a problem in the early days of desktop virtual instruments, and it'll get solved here too, by someone.
I've taken a look at Superpowered's library but the cost is prohibitively high.
I've started on a plugin to use the BASS library from un4seen developments but haven't gotten very far. It's my first use of FFI.
Looking through the soundpool.dart file, I notice that the play method really has just two lines that matter (after the assert statements):
```dart
int poolId = await _soundpoolId.future;
return await _platformInstance.play(poolId, soundId, repeat, rate);
```
Is there any way the poolId could be prefetched and saved to save an async function call?
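On the prefetching idea, a hedged sketch (only `_soundpoolId` and `_platformInstance` come from the snippet above; the rest are my own names). Note that the saving is probably tiny, since awaiting an already-completed future costs only a microtask, but it does remove one `await` from the hot path:

```dart
int? _cachedPoolId; // assumed new field on the Soundpool class

// Resolve the pool id once, e.g. right after loading sounds.
Future<void> warmUp() async {
  _cachedPoolId = await _soundpoolId.future;
}

// Hot path: skip the await when the id is already known.
Future<int> playFast(int soundId, {int repeat = 0, double rate = 1.0}) {
  final poolId = _cachedPoolId;
  if (poolId != null) {
    return _platformInstance.play(poolId, soundId, repeat, rate);
  }
  // Fall back to the original path if warmUp() hasn't finished yet.
  return _soundpoolId.future
      .then((id) => _platformInstance.play(id, soundId, repeat, rate));
}
```

Given the native `SoundPool.play` call alone was measured at 30-40 ms above, shaving a microtask here is unlikely to be noticeable.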
Update: I've got my flutter_bass example app working now using the un4seen BASS library. See this repo. The latency on an iPhone 6 is ~120ms but I'm still tinkering with configurations, etc. Stay tuned!
I added soundpool to my BASS example app to compare the latency in the same environment. BASS gives a mean (N=5) latency from button-push-to-playback-start of 115ms compared to 238ms for soundpool. See the README for a more detailed description.
I'm moving on to trying to integrate BASS in my full app.
A huge improvement! In terms of real-time performance, ~100 ms may work for sounds that have a slow onset, like synth pads and the like. Or it could work for any sound in loop-based sequencing, where you're clicking on a cell in real time to create a note for an instrument the next time the loop comes around, and anywhere in the cell duration works.
I'm getting 120 ms of latency with this using already-loaded `ByteData`. Is there anything I can do to substantially reduce this latency? Am I doing it wrong? Has any benchmark testing been done?

Details: Simple test app. This is called by the `onPointerDown` handler of a `Listener` widget. `soundData` is a class member that was previously created using `rootBundle.load(someFileName)`.
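For context, the setup described reads roughly like this (a reconstruction, since the issue doesn't include its code: every widget and field name here is a guess, and the `Soundpool.fromOptions()` constructor should be checked against the soundpool version in use):

```dart
import 'package:flutter/material.dart';
import 'package:flutter/services.dart' show ByteData, rootBundle;
import 'package:soundpool/soundpool.dart';

class TapPad extends StatefulWidget {
  const TapPad({Key? key}) : super(key: key);
  @override
  State<TapPad> createState() => _TapPadState();
}

class _TapPadState extends State<TapPad> {
  final Soundpool _pool = Soundpool.fromOptions();
  ByteData? soundData; // loaded once, ahead of time

  @override
  void initState() {
    super.initState();
    // someFileName is the placeholder used in the issue text.
    rootBundle.load(someFileName).then((d) => soundData = d);
  }

  @override
  Widget build(BuildContext context) {
    return Listener(
      // Load-and-play on every pointer down, as described above.
      onPointerDown: (_) async {
        final id = await _pool.load(soundData!);
        await _pool.play(id);
      },
      child: const SizedBox.expand(),
    );
  }
}
```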