Closed: ryanheise closed this issue 3 years ago.
Hi, happy to help. I ran the example code for the one-isolate branch but I'm getting this error:
```
[!] CocoaPods could not find compatible versions for pod "flutter_tts":
  In Podfile:
    flutter_tts (from `Flutter/ephemeral/.symlinks/plugins/flutter_tts/macos`)

Specs satisfying the `flutter_tts (from `Flutter/ephemeral/.symlinks/plugins/flutter_tts/macos`)` dependency
were found, but they required a higher minimum deployment target.
```
Ah, yes. I think there was a commit that added flutter_tts back in, but it looks like it wasn't tested. I'll fix up the deployment target and commit, but in the meantime you ought to be able to test older commits from before flutter_tts was reintroduced.
I don't think we can check the control center here in the Simulator, but when I minimise the app there is a small problem with the music. I will check this on a mobile (iPhone) later.
Regarding the flutter_tts issue, this should be fixed in the latest commit, along with an `onSeek` precision bug.
I was unable to reproduce the background behaviour - what versions of everything were you testing on? (audio_service, iOS, iPhone, mac...)
The control center is not accessible on iOS, but when things are working, you may be able to activate the lock screen and see the media controls there. But as things are not working now, you might not see them. Alternatively, you can try testing it on macOS (`flutter run -d macos`) on Big Sur, and you should then have access to macOS's control center.
OK this is weird, even when I test the old version which is known to work, my environment is no longer showing any control center either on the iOS simulator lock screen or the macOS system control center. I'm sure I had this displaying last week, so something must have changed in my environment.
Yes, with the latest commit the flutter_tts issue is fixed, but the macOS control center is not updating state like play/pause. `onSeek` works, but not properly.
```yaml
path_provider: ^2.0.1
audio_session: ^0.1.0
just_audio: ^0.7.3
flutter_tts: ^3.0.0
rxdart: ^0.26.0
audio_service:
  path: ../
cupertino_icons: ^1.0.2
```
Currently, I'm jealous that you can even see that much of the control center. I'm not sure what I've done to my environment, but I'm seeing absolutely nothing for the past couple of days.
But yes, those issues in the video are the ones I observed a couple of days ago when I could still test this. I think `onSeek` is working now, but the position is just calculated incorrectly because of an incorrect duration. So the issues are the incorrect duration being set, and in general the state not successfully updating.
Are you able to test any older commits? There are a lot of commits, but not that many that mention iOS or macOS.
No, I am not able to test any older commits right now 😞. Will check and let you know.
Hi, I have tested my app, which uses audio_service's one-isolate branch (latest commit 7a840c5) coupled with just_audio. Following are my observations:
A. Tested on iPhone 6 Plus, iOS 12.5.2:
B. Testing on iPhone SE 2nd Gen, iOS 14.3 simulator:
Common observation on both the real iPhone and the simulator: for the first song, just_audio always returned a negative position, until I seeked manually through the song or skipped to the next song. After skipping, it works properly.
Thanks @shripal17 for the real device testing!
2. It seems your latest commit [7a840c5](https://github.com/ryanheise/audio_service/commit/7a840c538937aec5b46d3bbc651fe8f35f5630c4) is invalid for iOS 12? The app receives the seek position in microseconds, but when I use that duration as milliseconds, it works as expected.
I'm not sure if I completely understand. The microseconds format is only used internally by the plugin for communication over method channels, but by the time it is delivered to the app it's converted to a `Duration` object.
When iOS receives the seek event from the control center, it does this:
```objc
[handlerChannel invokeMethod:@"seekTo" arguments: @{
  @"position": @((long long) (event.positionTime * 1000000.0))
}];
```
Then on the Flutter side of the plugin it receives it like this:
```dart
case 'seekTo':
  await callbacks.seek(SeekRequest(
      position: Duration(microseconds: call.arguments['position'])));
```
It is this `Duration` object that is then delivered to the app, where it can be queried in the app's desired format.
If you can, it would be great to try out some old commits (e.g. pre federated architecture) as that will help to find the cause and the solution.
Hi @ryanheise,
> I'm not sure if I completely understand. The microseconds format is only used internally by the plugin for communication over method channels but by the time it is delivered to the app it's converted to a Duration object.
I mean, when I pass the duration to the player as `Duration(milliseconds: position.inMicroseconds)`, it works as expected.
So, the resulting code looks like this:
```dart
@override
Future<void> seek(Duration position) =>
    player.seek(Duration(milliseconds: position.inMicroseconds));
```
The alpha testers of my app also saw the behaviour where the play/pause button is disabled (along with an out-of-sync progress bar) when `just_audio` is stopped due to an error. However, I wasn't able to find out why.
I recently introduced a feature that stops everything when there is an error, but this problem existed long before my addition.
For now our team is focused on getting our Android version out, but I'll certainly keep an eye on this.
Hi, I tested a bit on iPhone 12 mini.
With the latest https://github.com/ryanheise/audio_service/commit/7a840c538937aec5b46d3bbc651fe8f35f5630c4, seekTo works from control center --> app, i.e. the position is reflected. However it does not work the other way around. When I use seekTo from the Flutter app side, the audio starts playing from the new position, but this change in position is not updated in the control center. The control center continues playing from the old position like nothing happened. If you force another playbackState change, such as play/pause, the control center finds the correct position again. Interestingly, the same happens on skipToPrevious: the position in the control center is not updated but the media item data is updated.
@esiqveland your description makes sense. It seems that "seek" is technically using the correct scale (microseconds vs milliseconds) in each direction but it's just that sometimes iOS/macOS ignores our request to refresh the control center display. If we log the calls on the iOS/macOS side, we can see that it's making the call, but we just need to figure out why that call is being ignored.
It's very confusing and mysterious, but that is why it may be helpful to track down older commits to see when this broke and then look at the diff to see what code may have been responsible for it.
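To make that concrete, here is a minimal sketch of the kind of logging meant here, dropped into the plugin's `updateNowPlayingInfo` method. The logged fields are illustrative assumptions, not the plugin's actual variable names:

```objc
// Hypothetical debugging aid: log every call so we can confirm the plugin is
// asking the system to refresh the display, even when nothing changes on screen.
// The fields logged here (playing, position) are assumed names for illustration.
- (void)updateNowPlayingInfo {
    NSLog(@"updateNowPlayingInfo: playing=%d position=%@", playing, position);
    // ... existing code that populates MPNowPlayingInfoCenter's nowPlayingInfo ...
}
```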
Just a correction: although I had set the `ref` to `one-isolate` in pubspec, it took some older commit, no idea why. Today I changed it to 7a840c538937aec5b46d3bbc651fe8f35f5630c4 and now seeking through the notification centre works perfectly.
However, seeking from the app still doesn't reflect changes in the notification centre.
Please guide me, I am ready to experiment to help you fix these issues.
Thanks, @shripal17. I really appreciate the help!
The way I think we can narrow down some useful information is via the binary search method.
We have all these commits: A B C D E F G H I J K L M N O P Q R S T U V W X Y Z.
We know that A works and Z is broken (with "A" being the base of the `one-isolate` branch and "Z" being HEAD). So we test commit M (somewhere in the middle - is it a coincidence that "M" is in the "M"iddle of the alphabet, by the way?). If it works, we try "T", and if that doesn't work, we next try the commit somewhere halfway between "M" and "T", etc. That way I think it would be possible to narrow down which specific commits broke specific behaviours.
One more important thing to note is that after the branch point, I initially developed solely the Android side and the iOS side didn't catch up and start working again until e789a13640bd59e74a8c295ebb93ef31c1612753 where you could say I got some basic functionality restored. You can check the git logs and search for keywords iOS or macOS to see the relevant commits on the iOS/macOS side.
Ahh, I get your approach. I see that the iOS code is written in Objective-C? Unfortunately, I have zero experience in Objective-C and near-to-zero experience in Swift. But at least I could be of some help if it were in Swift, as it is comparatively easier to understand and somewhat similar to Kotlin (in which I have expertise).
I'll start the binary search approach from the e789a13 commit in a few days and will keep you updated!
Could this be a reason why it does not update the playback position while already playing, coming from a playbackStateEvent? This would skip `updateNowPlaying` when already playing and only receiving a new playback position.
Unfortunately not, `updateNowPlaying` is only concerned with updating the media item metadata (title, artwork, duration) and the play/pause state. One bug I can see there is that a speed change should also trigger an `updateNowPlaying` call, but the original bug is still there: the nowPlaying info is not updated even when this method is called, which appears to be every time except for speed changes.
> Could this be a reason why it does not update playback position while already playing, coming from a playbackStateEvent?
That's exactly it. On iOS the seekbar moves forward by itself when the speed is 1.0, so we don't actually need to broadcast these updates. When we seek, the update event should be fired separately from the playback event stream, but the nowPlaying position update is muted by this condition; essentially, any updates other than play/pause are ignored.
> Unfortunately not, `updateNowPlaying` is only concerned with updating the media item metadata (title, artwork, duration) and the play/pause state. One bug I can see there is that a speed change should also trigger an `updateNowPlaying` call, but the original bug is still there: the nowPlaying info is not updated even when this method is called, which appears to be every time except for speed changes.
I don't see anywhere else in the file that would set a new playback position when the position change comes from the Flutter side.
I might be misunderstanding what this part of the plugin actually does, though. This would however explain why it helps to pause/unpause to get an update through, and would explain why my `skipToPrevious` action does not move the playback position in the iOS media info, because I only do a `seekTo(Duration.zero)` on the Flutter side when `skipToPrevious` is called.
FYI I just commented out this condition and it works as intended.
Actually, on Android it's the same, we only need to update the position when we seek, not continuously
I suggest that we add a utility stream for notification updates; that could also resolve https://github.com/ryanheise/audio_service/issues/658 (update: or at least help resolve it).
I'm not convinced commenting out this condition fixes the underlying issue. If you insert a log into `updateNowPlayingInfo`, you will see that it is actually being called at various appropriate times and yet not having the desired effect. For example, if you click the skip next/previous buttons in the control center, it will trigger a call to `updateNowPlayingInfo` and yet the metadata still isn't updated in the control center.
Incidentally, the split between `setMediaItem` and `setState` is originally from Android's model and doesn't map perfectly to the way the iOS control center works, so we do not want to update the `nowPlayingInfo` every time there is a state change; we only want to update it when the media item changes or the play/pause/speed changes.
To me, there are still mysteries here in the behaviour I'm seeing and I think we need to understand them to ensure that a robust solution is implemented.
> Actually, on Android it's the same, we only need to update the position when we seek, not continuously
Not really, time discontinuities can occur not only during seeks but also with stalling, drifting and speed changes (because it sets a new base time for position projections).
> I'm not convinced commenting out this condition fixes the underlying issue. If you insert a log into `updateNowPlayingInfo`, you will see that it is actually being called at various appropriate times and yet not having the desired effect. For example, if you click the skip next/previous buttons in the control center, it will trigger a call to `updateNowPlayingInfo` and yet the metadata still isn't updated in the control center.
>
> Incidentally, the split between `setMediaItem` and `setState` is originally from Android's model and doesn't map perfectly to the way the iOS control center works, so we do not want to update the `nowPlayingInfo` every time there is a state change, we only want to update it when the media item changes or the play/pause/speed changes.

> For example, if you click the skip next/previous buttons in the control center, it will trigger a call to updateNowPlayingInfo and yet the metadata still isn't updated in the control center.
skipToPrev/skipToNext from the control center is working fine for me at least, except when skipping only does a seekTo from the Flutter side. When actually doing a skip, the media item and playback position are correctly updated for me.
From @girish54321's video above:
Around the 43 second mark, clicking on the skip to next button does not update the nowPlayingInfo in the control center. However, logging this method call will show that it is being called at the right time.
> I'm not convinced commenting out this condition fixes the underlying issue. If you insert a log into updateNowPlayingInfo, you will see that it is actually being called at various appropriate times and yet not having the desired effect. For example, if you click the skip next/previous buttons in the control center, it will trigger a call to updateNowPlayingInfo and yet the metadata still isn't updated in the control center.
I cannot reproduce what you described here; when I comment out this condition and make `updateNowPlayingInfo` always fire, it fixes the problem completely with no side effects.
> Not really, time discontinuities can occur not only during seeks but also with stalling, drifting and speed changes (because it sets a new base time for position projections).
Ok, so then that's yet another reason this condition should be removed
And https://github.com/ryanheise/audio_service/issues/684#issuecomment-827642366 has the same reason I believe
Video with condition commented out:
This is truly mysterious, macOS nowPlaying is not being fixed by this
> This property only applies to macOS. You must set this property every time the app begins or halts playback, otherwise remote control functionality may not work as expected.
https://developer.apple.com/documentation/mediaplayer/mpnowplayinginfocenter/2588243-playbackstate?language=objc Gonna test this out
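If that turns out to be the missing piece, a minimal sketch of what setting the property could look like, assuming `playing` is the plugin's current play/pause flag (this is not the plugin's actual code):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Hypothetical helper: mirror the plugin's play/pause flag onto the
// MPNowPlayingInfoCenter playbackState property (macOS-only, per the docs above).
static void updateMacOSPlaybackState(BOOL playing) {
    MPNowPlayingInfoCenter *center = [MPNowPlayingInfoCenter defaultCenter];
    center.playbackState =
        playing ? MPNowPlayingPlaybackStatePlaying : MPNowPlayingPlaybackStatePaused;
}
```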
> Not really, time discontinuities can occur not only during seeks but also with stalling, drifting and speed changes (because it sets a new base time for position projections).

> Ok, so then that's yet another reason this condition should be removed
But position changes are not the concern of `nowPlayingInfo`.
> https://developer.apple.com/documentation/mediaplayer/mpnowplayinginfocenter/2588243-playbackstate?language=objc Gonna test this out
Ah, that rings a bell. I must have passed over that several times thinking "later", because macOS was working anyway (on the master branch). Let's see.
> Ok, so then that's yet another reason this condition should be removed
I would just add more conditions to it, e.g.:
```objc
if (playing != wasPlaying || ![speed isEqualToNumber:oldSpeed]) {
    [self updateNowPlayingInfo];
}
```
> But position changes are not the concern of nowPlayingInfo.
It is:

```objc
nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = [NSNumber numberWithInt:([position intValue] / 1000)];
```
https://github.com/ryanheise/audio_service/issues/684#issuecomment-827660160 I don't think this would fix the position not being updated when a new one is received.
> But position changes are not the concern of `nowPlayingInfo`.
Sorry, I was in error here. That does certainly explain the position issue at least. But again, I would update the condition with another one to check if the position has changed. One other thing to perhaps look at is that we are not actually using `updateTime`. Each time you set the position in `nowPlayingInfo`, that position should really be a projection from the `updateTime` base. I'm not sure what `MPNowPlayingInfoPropertyCurrentPlaybackDate` does; maybe it would provide us automatic handling of `updateTime`, although I haven't tested that.
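As an illustration of the projection idea, here is a sketch only, assuming `position` and `updateTime` arrive from the Dart side in milliseconds and `speed` is the current playback rate (these names are not necessarily the plugin's):

```objc
// Hypothetical sketch: project the elapsed time forward from the last state
// update so the value written to nowPlayingInfo is current when it is set.
static NSNumber *projectedElapsedSeconds(NSNumber *positionMs, NSNumber *updateTimeMs,
                                         NSNumber *speed, BOOL playing) {
    double elapsedMs = [positionMs doubleValue];
    if (playing) {
        double nowMs = [[NSDate date] timeIntervalSince1970] * 1000.0;
        elapsedMs += (nowMs - [updateTimeMs doubleValue]) * [speed doubleValue];
    }
    return @(elapsedMs / 1000.0);
}
```

Together with `MPNowPlayingInfoPropertyPlaybackRate`, the system can then advance the displayed position on its own between updates.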
> I would update the condition with another one
What if the title is changed? What if the artwork is changed? The artist? We would then have to check everything, and this won't pay off in any performance; I think it may even make it worse, since the state will almost certainly be different.
Those ones are not part of the `setState` case, they are part of the `setMediaItem` case. There are only a very small number of things in `setState` that need to be transferred to the `nowPlayingInfo`. `setState` can be very noisy and receives a lot of state updates that should not require a refresh of `nowPlayingInfo`, so we should do our best to avoid updating `nowPlayingInfo` unless one of its fields actually needs to be updated.
> Each time you set the position in `nowPlayingInfo`, that position should really be a projection from the `updateTime` base. I'm not sure what `MPNowPlayingInfoPropertyCurrentPlaybackDate` does, maybe it would provide us automatic handling of `updateTime` although I haven't tested that.
It appears `MPNowPlayingInfoPropertyCurrentPlaybackDate` may have something to do with the broadcast date in HLS streams, so it's probably not what I wanted here.
I managed to get it working - `playbackState` and `MPNowPlayingInfoPropertyPlaybackProgress` were required.
Will clean this up to make a PR. In the meantime, could you please check https://github.com/ryanheise/just_audio/issues/391 ?
Nice! So macOS requires `MPNowPlayingInfoPropertyPlaybackProgress` as well? The documentation didn't seem to indicate it was required, particularly if we have the more fine-grained `MPNowPlayingInfoPropertyElapsedPlaybackTime`:
Note from the documentation of the above:
> Elapsed time is automatically calculated, by the system, from the previously provided elapsed time and the playback rate. Do not update this property frequently.
Hence why we need this guard condition.
Yeah, that's also why we should discourage devs from using `playbackEventStream` directly.
> The documentation didn't seem to indicate it was required, particularly if we have the more fine-grained MPNowPlayingInfoPropertyElapsedPlaybackTime
Apple never changes
Actually, it turns out I was wrong: `playbackState` is sufficient. It didn't work the first time for some reason, which led me to think that `MPNowPlayingInfoPropertyPlaybackProgress` was required.
Leaving the notification to rely on playbackEventStream to update the position is highly expensive. Do you think this would be a good solution: https://github.com/ryanheise/audio_service/issues/684#issuecomment-827629143 ? One serious drawback is that the artwork gets updated often, which inevitably happens in this case because a new map is created each time, and it will sometimes accidentally unload.
I think that can be optimised away by checking on the iOS side that something's actually changed before updating `nowPlayingInfo`.
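A sketch of what that check might look like, extending the condition proposed earlier; variable names such as `oldPosition` and `oldMediaItemId` are assumptions for illustration, not the plugin's actual state:

```objc
// Hypothetical guard: only refresh nowPlayingInfo when one of the fields it
// actually carries has changed since the last refresh.
BOOL changed = (playing != wasPlaying)
    || ![speed isEqualToNumber:oldSpeed]
    || ![position isEqualToNumber:oldPosition]
    || ![mediaItemId isEqualToString:oldMediaItemId];
if (changed) {
    [self updateNowPlayingInfo];
}
```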
If we are updating it every second, this is already too much. Isn't it?
Hah, ok, I found that at the OS level now playing is broken:
https://drive.google.com/file/d/19Fc82IfEfXlctD98Hi6JS9VbqanX1Ovu/view?usp=sharing
The disappearing artwork might just be another bug in the OS.
I suspect this has something to do with the fact that audio_session does nothing on macOS, and that prevents audio_service from taking exclusive control of the audio session while playing audio. There's a bug in Xcode that prevents AVAudioSession from compiling on macOS even though its documentation says that it's supported. Online you can find instructions on how to fix Xcode, but we can't rely on all users to fix their copy of Xcode in order to compile the plugin.
I'm off to sleep for the next 8 hours, but a big thanks @nt4f04uNd for helping to investigate this bug!
Help Wanted
We are preparing the upcoming 0.18 release of audio_service on the `one-isolate` branch with a number of performance, feature and API improvements. The last critical piece of the puzzle is to restore the iOS/macOS control center functionality.
If you would like to help, the more eyes on it the faster we should be able to make progress. As a tester, you could help by checking out (via git) older versions of the code on the `one-isolate` branch and running them to see if the control center was working, and in this way identify which specific commit(s) broke the control center. That in turn will help us identify the relevant code and come up with a fix. If you find a commit where something broke, please share the commit reference below.

If you have some experience with iOS programming, you can help in other ways by reviewing and debugging the code. Some ideas of what to look into can be found at the bottom of this bug report.
Which API doesn't behave as documented, and how does it misbehave?
`playbackState` and `mediaItem` state changes are forwarded to the iOS/macOS platform side but they do not consistently result in changes to the control center and now playing info.

Minimal reproduction project

The example on the `one-isolate` branch exhibits the issue as is.

To Reproduce (i.e. user steps, not code)
Steps to reproduce the behavior:
You may find one of various problems. E.g. the first state update is reflected in the control center but subsequent ones aren't. Or you may find that the buttons are disabled. Or you may even find that the first state update isn't reflected.
Error messages
None
Expected behavior
Actions in the Flutter UI that trigger state changes should be reflected appropriately in the control center.
Screenshots
None
Runtime Environment (please complete the following information if relevant):
Flutter SDK version
Additional context
Here are some potential things to look into:
- A `nowPlayingInfo` race condition can prevent updates: https://stackoverflow.com/questions/34867294/mpnowplayinginfocenter-nowplayinginfo-not-updating-at-end-of-track
- `MPRemoteCommandCenter`, `MPNowPlayingInfoCenter` and `AVAudioSession`: ensure they're all working in coordination, e.g. checking that the timing of these calls works together.
- Compare the `MPRemoteCommandCenter`, `MPNowPlayingInfoCenter` and `AVAudioSession` APIs in both versions of the code and try to spot any differences that could explain the behaviour.