ryanheise / audio_service

Flutter plugin to play audio in the background while the screen is off.
803 stars · 480 forks

iOS Support #10

Closed hacker1024 closed 4 years ago

hacker1024 commented 5 years ago

I'm creating this issue to keep track of an iOS implementation. I myself am an Android developer, and I'd love to see an iOS developer contribute.

ryanheise commented 5 years ago

To potential iOS contributors

This post describes the work to be done on the iOS side and what might be the best way to break it down.

Android and iOS differences

The architecture of this plugin was really dictated by the nature of Android's component system, so first let's try to understand why things must be the way they are if the plugin is to work on both Android and iOS. In Android, an app is made up of a set of individual components which may be dynamically spawned and destroyed over the lifetime of the app. An audio playing app needs an "Activity" component which is only allowed to run foreground code (i.e. the UI), and a "Service" component which is only allowed to run background code (i.e. playing audio). The idea is that the Activity could be swiped away by the user, i.e. destroyed and freed from memory, while the Service component is allowed to live on and play audio in the background. Flutter automatically sets up an isolate inside the activity to run your Dart code, but since this activity could be swiped away at any time, we can't put the audio playing code here. We therefore use a relatively new Flutter API to create our own isolate attached to the Service for running Dart code, and this is where the audio playing code needs to go. Flutter's API for background execution of Dart code is described in this Medium article.

iOS developers in the thread discussion below have pointed out that things fortunately may be a lot simpler on iOS, since there is no such separation of foreground and background code. This may mean that the iOS implementation of the plugin would not need to use Flutter's background execution API. However, to maintain compatibility with Android's behaviour, I would suggest that the iOS implementation should still spawn a plain old isolate for running the background audio code. This would ensure that call-by-value semantics are maintained on both platforms.
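To illustrate why a plain isolate preserves call-by-value semantics, here is a minimal sketch (not the plugin's actual code) of spawning a background isolate and exchanging messages with it; every message crosses the isolate boundary as a copy, so the same semantics hold whether or not the platform needs true background execution:

```dart
import 'dart:isolate';

// Entry point for the background isolate. It sends back a SendPort that the
// main isolate can use to deliver commands.
void backgroundTask(SendPort replyPort) {
  final commands = ReceivePort();
  replyPort.send(commands.sendPort);
  commands.listen((message) {
    if (message == 'play') {
      // The audio playing code would live here.
      replyPort.send('playing');
    }
  });
}

Future<void> main() async {
  final fromBackground = ReceivePort();
  await Isolate.spawn(backgroundTask, fromBackground.sendPort);
  final events = fromBackground.asBroadcastStream();
  final toBackground = await events.first as SendPort;
  toBackground.send('play'); // the message is copied, not shared
  print(await events.first); // prints: playing
  fromBackground.close();
}
```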

Architecture overview

This plugin sets up a Dart isolate for running audio playback code in a way that continues to run when the app is not in the foreground. The Dart code that runs in this isolate is called the "background task". The architecture allows one or more "clients" to connect to the background task to control audio playback and be notified of state changes to update the UI. One of those clients is the Flutter user interface, but other clients may include a Bluetooth headset or a smart watch. There are distinct APIs for the Flutter client side and for the background task.

For example, AudioService.play() sends a message to the background task which is handled by its onPlay callback. But the message is not sent directly from point A to point B. Rather, it must go through a platform channel to the native plugin so that the plugin has a chance to do all of the things that a native app must typically do before playing audio. On Android, this involves acquiring audio focus and calling various APIs to allow the app to keep running while the app is not in the foreground and while the screen is turned off (acquiring a wake lock and starting a service) as well as showing a notification to indicate that the app is playing audio in the background. Once all of these things are done, the plugin THEN passes the message on to the Dart background task via its onPlay callback. Regardless of which client the play command originated from, the plugin should handle the command in the same way by performing any required native rituals to prepare for audio playback and then finally pass control to onPlay.
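As a rough sketch of this message path (the channel names here are hypothetical, and the real plugin's wiring may differ): the UI-side call crosses a platform channel so the native code can do its preparation first, and only then is the message relayed to the background task's onPlay callback.

```dart
import 'package:flutter/services.dart';

class AudioServiceClient {
  // Hypothetical channel name for illustration only.
  static const _channel = MethodChannel('audio_service.client');

  // Called from the Flutter UI. The native side acquires audio focus,
  // starts the service/notification (on Android), etc., before forwarding.
  static Future<void> play() => _channel.invokeMethod('play');
}

// In the background isolate, the plugin delivers callbacks over a second
// channel (again, a hypothetical name):
const _backgroundChannel = MethodChannel('audio_service.background');

void registerBackgroundTask({required Future<void> Function() onPlay}) {
  _backgroundChannel.setMethodCallHandler((call) async {
    switch (call.method) {
      case 'onPlay':
        await onPlay(); // native setup is already done at this point
        break;
    }
  });
}
```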

Stage 0: Foundation

A good starting point would be to implement the method call to start the background isolate, and to implement all other method calls either to directly pass through from one isolate to the other, or even implement them as no-ops. The only methods that absolutely need to work in this stage are:

With this much, the demo example app should start playing audio and will stop once the audio has run out by itself, but will not allow clients to control playback and will not notify the clients of state changes.

Stage 1: Basic functionality

The basic functionality provides the ability to control playback with the basic play/pause operations (which can also be controlled with a headset button click), and also the ability to stop the background task on request. The relevant method calls to implement are: connect, disconnect, ready, isRunning, setState/onPlaybackStateChanged, stop/onStop, pause/onPause, play/onPlay, setMediaItem/onMediaChanged and click/onClick. I'll be happy to answer questions about any of these in the comments below.

There is also a set of method calls that allow jumping to different positions and tracks. These don't require any special setup on the native side, so all they do is forward the messages on to the background task. They are: seekTo/onSeekTo, skipToNext/onSkipToNext, skipToPrevious/onSkipToPrevious, fastForward/onFastForward and rewind/onRewind.

The plugin should also have a default case for custom actions (whose method names begin with "custom_"). These are also just forwarded directly to the background task without any special setup on the native side.
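A sketch of that default case might look like the following; forwardToBackgroundTask stands in for however the plugin relays a call to the background isolate's channel and is a hypothetical helper, not part of the actual API:

```dart
import 'package:flutter/services.dart';

Future<dynamic> handleClientCall(
  MethodCall call,
  Future<dynamic> Function(MethodCall) forwardToBackgroundTask,
) {
  if (call.method.startsWith('custom_')) {
    // No audio focus, wake lock or notification work is needed: pass through.
    return forwardToBackgroundTask(call);
  }
  throw MissingPluginException('No handler for ${call.method}');
}
```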

Stage 2: Queue functionality

This functionality adds the ability to manipulate the playlist/queue and jump to an arbitrary media item. The relevant method calls are: addQueueItem/onAddQueueItem, addQueueItemAt/onAddQueueItemAt, removeQueueItem/onRemoveQueueItem, skipToQueueItem/onSkipToQueueItem and playFromMediaId/onPlayFromMediaId.

Stage 3: Browsing functionality

This adds the ability for clients to browse media items offered by the background task. The relevant methods are: setBrowseMediaParent, notifyChildrenChanged, onChildrenLoaded and onLoadChildren.

Addressing API differences between Android and iOS

When there are equivalent features on iOS and Android, the preference is to implement both sides with the same API. If things would be easier on the iOS side if the API were changed, please make the suggestion below! I would definitely prefer to change the API if it means that the Android and iOS APIs can be harmonised.

In cases where a purely iOS-specific feature is desirable, it can be added as long as it is named with an ios prefix.

hacker1024 commented 5 years ago

Stage 3: Browsing functionality
This adds the ability for clients to browse media items offered by the background task. The relevant methods are: setBrowseMediaParent, notifyChildrenChanged, onChildrenLoaded and onLoadChildren.

Read the Android docs to see what these methods should actually do: https://developer.android.com/guide/topics/media-apps/audio-app/building-a-mediabrowserservice

alexelisenko commented 5 years ago

I started working on the iOS implementation, but I'm running into several issues.

The Flutter background isolate example does not run on iOS.

https://medium.com/flutter-io/executing-dart-in-the-background-with-flutter-plugins-and-geofencing-2b3e40a1a124

Launching lib/main.dart on iPhone XS Max in debug mode...
Running pod install...
Running Xcode build...
Xcode build done.                                            1.2s
Failed to build iOS app
Error output from Xcode build:
Could not build the application for the simulator.
Error launching application on iPhone XS Max.
** BUILD FAILED **

Xcode's output:
=== BUILD TARGET geofencing OF PROJECT Pods WITH CONFIGURATION Debug ===
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:26:8: error: unknown type name 'FlutterPluginRegistrantCallback'
    static FlutterPluginRegistrantCallback registerPlugins = nil;
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:40:38: error: expected a type
    + (void)setPluginRegistrantCallback:(FlutterPluginRegistrantCallback)callback {
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:125:42: warning: 'center' is deprecated: first deprecated in iOS 7.0 - Please see CLCircularRegion [-Wdeprecated-declarations]
    CLLocationCoordinate2D center = region.center;
In module 'CoreLocation' imported from /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.h:6:
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/CoreLocation.framework/Headers/CLRegion.h:78:56: note: 'center' has been explicitly marked deprecated here
    @property (readonly, nonatomic) CLLocationCoordinate2D center API_DEPRECATED("Please see CLCircularRegion", ios(4.0, 7.0), macos(10.7, 10.10)) API_UNAVAILABLE(tvos);
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:142:20: warning: 'setAllowsBackgroundLocationUpdates:' is only available on iOS 9.0 or newer [-Wunguarded-availability]
    _locationManager.allowsBackgroundLocationUpdates = YES;
In module 'CoreLocation' imported from /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.h:6:
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/CoreLocation.framework/Headers/CLLocationManager.h:266:35: note: 'setAllowsBackgroundLocationUpdates:' has been explicitly marked partial here
    @property(assign, nonatomic) BOOL allowsBackgroundLocationUpdates API_AVAILABLE(ios(9.0), watchos(4.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(tvos);
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:142:20: note: enclose 'setAllowsBackgroundLocationUpdates:' in an @available check to silence this warning
    _locationManager.allowsBackgroundLocationUpdates = YES;
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:220:74: warning: sending 'const NSString *__strong' to parameter of type 'NSString * _Nonnull' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
    NSMutableDictionary *callbackDict = [_persistentState dictionaryForKey:key];
In module 'Foundation' imported from /Users/alex/Documents/project/FlutterGeofencing/example/ios/Pods/Headers/Public/Flutter/Flutter/FlutterBinaryMessenger.h:8:
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSUserDefaults.h:93:73: note: passing argument to parameter 'defaultName' here
    - (nullable NSDictionary<NSString *, id> *)dictionaryForKey:(NSString *)defaultName;
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:220:24: warning: incompatible pointer types initializing 'NSMutableDictionary *' with an expression of type 'NSDictionary<NSString *,id> * _Nullable' [-Wincompatible-pointer-types]
    NSMutableDictionary *callbackDict = [_persistentState dictionaryForKey:key];
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:222:18: warning: incompatible pointer types assigning to 'NSMutableDictionary *' from 'NSDictionary *' [-Wincompatible-pointer-types]
    callbackDict = @{};
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:223:53: warning: sending 'const NSString *__strong' to parameter of type 'NSString * _Nonnull' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
    [_persistentState setObject:callbackDict forKey:key];
In module 'Foundation' imported from /Users/alex/Documents/project/FlutterGeofencing/example/ios/Pods/Headers/Public/Flutter/Flutter/FlutterBinaryMessenger.h:8:
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSUserDefaults.h:82:57: note: passing argument to parameter 'defaultName' here
    - (void)setObject:(nullable id)value forKey:(NSString *)defaultName;
/Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:231:46: warning: sending 'const NSString *__strong' to parameter of type 'NSString * _Nonnull' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
    [_persistentState setObject:mapping forKey:key];
In module 'Foundation' imported from /Users/alex/Documents/project/FlutterGeofencing/example/ios/Pods/Headers/Public/Flutter/Flutter/FlutterBinaryMessenger.h:8:
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSUserDefaults.h:82:57: note: passing argument to parameter 'defaultName' here
    - (void)setObject:(nullable id)value forKey:(NSString *)defaultName;
7 warnings and 2 errors generated.

If anyone knows how the above can be fixed, please let me know. In the meantime, I will be working on a standalone audio_player_service plugin that does not use a background isolate, as it doesn't seem to be required for iOS.

Also, the audio_service project was created with an older iOS template, which was also broken. I created a new repo and moved over the Android code, which fixed it, so I recommend recreating this repo with the latest Flutter plugin template.

ryanheise commented 5 years ago

Hi @alexelisenko and first thank you so much for trying to get the iOS side working.

I unfortunately don't own a Mac to test this on, however I did do a Google search for the missing type name "FlutterPluginRegistrantCallback" and found that apparently this type was only committed to git very recently: https://github.com/flutter/engine/commit/1ba32955066954f7fb5bbc793a4954b529a4a8c8#diff-dc093c279508589dcc672b1f75d69a11

So your two options would be:

  1. Check out an older version of the FlutterGeofencing repo; or
  2. Upgrade to the latest version of the Flutter engine.

It is probably better to do the latter since there's a chance you'll want to use the latest APIs.

First up, of course:

flutter upgrade

If the plugin still doesn't work, then you need to switch to a channel that is updated more regularly. First, you can try the beta channel which is what I personally use:

flutter channel beta

The beta channel is updated once a month, so I'd expect it should have this missing datatype since it was added last month. If it doesn't have the missing data type yet, then you can try:

flutter channel dev

You could then switch back to beta within a month after the missing datatype has had time to make its way into beta.

Regarding the project template, I agree and would love to update it, although would you like me to generate a Swift or an Objective-C template?

alexelisenko commented 5 years ago

@ryanheise switching to the beta channel resolved the error, thanks.

I think you should go with Objective-C, since it reduces the setup complexity for background execution. The Swift template actually uses both Swift and Objective-C by adding bridging headers, which complicates the setup and the examples that are available for background execution.

I am currently fleshing out all the iOS requirements in a standalone Xcode project before integrating them into the plugin.

docaohuynh commented 5 years ago

Waiting for this implementation 👍

ryanheise commented 5 years ago

@alexelisenko you've become a very popular person :-)

Regarding the iOS template, I'll try to update it tonight.

I watched an introductory video about Objective-C recently, and while it's still a bit foreign to me, I might try my hand at filling in the basic plugin structure in a separate branch (it's bound to have syntax errors though, since I don't have a Mac / Xcode to test it on). Swift looks a bit easier to me, but I'll trust you that the bridging headers add complexity.

alexelisenko commented 5 years ago

After playing around with Flutter's background Dart execution, I abandoned that approach for iOS to keep to my project deadlines. The good news is that I do have a working audio service for iOS (for audio and video files), which I will share here soon (I have a few tweaks to make before making it public).

This does mean that I had to resort to using things like if (platform.isIOS) {}, but this was honestly the fastest route for my project. Once I publish my version (which does not use background Dart code), anyone who needs this functionality can at least use it. I do hope to have time to merge my work with this plugin; I just didn't have enough time to fiddle with the background Dart implementation.

@ryanheise We could potentially create a wrapper interface for my iOS plugin, without adding the background Dart execution. One of the main reasons I wrote my own plugin for iOS was that the audioplayer plugin typically used with audio_service was not feature-complete. The audio_player_service plugin I wrote is basically the service and player rolled into one; since iOS does not require much for background audio playback, the background Dart execution added a lot of code and moving parts that aren't needed.

I will post the link to the plugin here, hopefully in the next few days.

alexelisenko commented 5 years ago

Here is the iOS plugin: https://github.com/alexelisenko/audio_player_service

I would consider this in BETA. While it does work, it is lacking in documentation and thorough testing.

The example project does give you everything you need to use it, but it does not include examples of all the methods available, so some code reading will be required for some use cases.

Please keep in mind that I will be making updates to this repo, which may or may not be breaking changes.

EDIT: I still intend to integrate this into audio_service. It may need a different interface for iOS, or the same interface with some of the methods doing nothing. Once I'm done with my active project I will circle back to this, but for now, I need to get my project done :)

ryanheise commented 5 years ago

Excellent! I fully expect iOS to have a different, though partially overlapping, set of features, so it will be interesting to get an iOS developer's input to help shape the audio_service API.

Does your plugin handle button clicks on the headset to play/pause media playback, and if so, which part of the code is responsible for it? I'd be interested to know if this is just handled automatically, or whether iOS gives you the flexibility to handle these button clicks with your own callbacks. After a Stack Overflow search, I found this (https://stackoverflow.com/questions/9797562/iphone-headset-play-button-click), which gives me hope that it's possible.

alexelisenko commented 5 years ago

iOS uses what's called the CommandCenter to display media metadata and provide controls.

The plugin supports the following commands:

togglePlayPauseCommand
playCommand
pauseCommand
stopCommand
nextTrackCommand
previousTrackCommand
changePlaybackPositionCommand (seeking from the command center / external devices)

The unimplemented commands available are:

[commandCenter.skipForwardCommand setEnabled:NO];
[commandCenter.skipBackwardCommand setEnabled:NO];
[commandCenter.enableLanguageOptionCommand setEnabled:NO];
[commandCenter.disableLanguageOptionCommand setEnabled:NO];
[commandCenter.changeRepeatModeCommand setEnabled:NO];
[commandCenter.seekForwardCommand setEnabled:NO];
[commandCenter.seekBackwardCommand setEnabled:NO];
[commandCenter.changeShuffleModeCommand setEnabled:NO];

// Rating Command
[commandCenter.ratingCommand setEnabled:NO];

// Feedback Commands
 // These are generalized to three distinct actions. Your application can provide
// additional context about these actions with the localizedTitle property in
// MPFeedbackCommand.
[commandCenter.likeCommand setEnabled:NO];
[commandCenter.dislikeCommand setEnabled:NO];
[commandCenter.bookmarkCommand setEnabled:NO];

The above can be added very easily.

I have tested this with various Bluetooth headphones and over Bluetooth audio in several cars.

As far as what code is responsible, look at this file: https://github.com/alexelisenko/audio_player_service/blob/master/ios/Classes/AudioPlayer.m

The init method sets up the CommandCenter with the callbacks.

NOTE: The iOS simulator does not show the command center, so it has to be tested on a device, unfortunately.

ryanheise commented 5 years ago

So then when you click the headset button, does iOS itself maintain a play/pause toggle state and flip it on a click? That would be unfortunate, since what I'm really hoping for is to be able to just listen to button click events and allow the application to process them and manage its own state.

alexelisenko commented 5 years ago

Each of the commands I listed requires a callback, and these are implemented in the plugin. The state is managed entirely by the plugin, meaning it's the plugin's job to decide what to do with the command events.

adriancmurray commented 5 years ago

Sorry, I'm on my phone and haven't had a chance to inspect the plugin yet. Though I'm curious: it looks like skip controls are in, but you mention no background execution. Can you create a playlist with your plugin?

alexelisenko commented 5 years ago

@adriancmurray Yes, the plugin actually requires that you initialize a queue of items (which can be just one item). The plugin does not automatically play the next item in the queue, although this could be added easily. I'm currently playing the next item in the queue in the actual Flutter project using the plugin, by watching the playback state and calling next as needed.
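The workaround described above could be sketched on the Dart side roughly as follows; the state values and method names here are hypothetical, not the actual audio_player_service API:

```dart
import 'dart:async';

// Hypothetical playback states; the real plugin's state values may differ.
enum PlaybackState { playing, paused, completed }

class QueueAdvancer {
  final Stream<PlaybackState> stateStream; // assumed to be exposed by the player
  final Future<void> Function() next;      // assumed skip-to-next call
  QueueAdvancer(this.stateStream, this.next);

  StreamSubscription<PlaybackState>? _sub;

  void start() {
    _sub = stateStream.listen((state) {
      if (state == PlaybackState.completed) {
        next(); // manually advance to the next queue item
      }
    });
  }

  void stop() => _sub?.cancel();
}
```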

adriancmurray commented 5 years ago

This is excellent news! Thank you!

alexelisenko commented 5 years ago

@adriancmurray There is one known bug with this, though:

When supplying multiple items in the queue, any Bluetooth devices that display the queue position (i.e. 2/10 etc.) do not display the correct index. I'm still working on a fix.

This is related to not being able to easily go to the previous item and having to re-init the queue, which screws up the metadata for external devices that rely on this data.

This does not affect devices that do not look for this data, or general playlist playback via the command center.

ryanheise commented 5 years ago

Ah, nice! So for example, on receiving a togglePlayPauseCommand, the application in theory has the power to decide based on runtime conditions to ignore the click without necessarily running the risk of falling out of sync with iOS's internal toggle state (because there is no iOS internal toggle state).

Well... It sounds like you have all of the features I would need for my own use case! :+1:

adriancmurray commented 5 years ago

Thanks for the heads up. Once I implement the plugin I’ll be sure to check out the bug and see what solutions I can come up with myself.

alexelisenko commented 5 years ago

@ryanheise Yes, if the callback does not actually play or pause the Player, the command center will not flip the play/pause button state. The state is provided to the OS via the nowPlayingInfo property of the CommandCenter. This property is updated by the plugin whenever the plugin state changes.

ryanheise commented 5 years ago

I see, that is more or less the way it also works in Android, so it should fit nicely into the current plugin API.

I've just copied the latest iOS template across to audio_service and will look more in depth at your iOS code tomorrow.

hacker1024 commented 5 years ago

@alexelisenko Interesting. I'm actually going to publish my own audio player plugin soon, which is designed for queuing URLs. Perhaps, when I publish it, you could decouple your media player code and move it into there? (It's a really simple plugin: just all the usual media and queue management functions to implement.)

That way you're one step closer to the way this plugin works.

alexelisenko commented 5 years ago

@hacker1024 What do you mean by queuing URLs?

hacker1024 commented 5 years ago

@alexelisenko The plugin maintains a list of audio URLs to stream and will play through them. It preloads upcoming queue items, which is what I couldn't get any other plugin to do.

EDIT: The Android implementation uses ExoPlayer's ConcatenatingMediaSource.

alexelisenko commented 5 years ago

@hacker1024 So if you pass it a list of URLs, it will download them, then load them from the local filesystem?

hacker1024 commented 5 years ago

@alexelisenko Yes, but to RAM or a cache only. The downloads aren't made to be accessed.

ZheGuangZeng commented 5 years ago

Great Job Guys. Waiting...

adriancmurray commented 5 years ago

For those waiting for background audio plugins... might I suggest just writing it yourself. I was hoping for a plugin to be made, and I briefly used one mentioned earlier in this thread (as in, I ran my own code on it for about 5 hours until I realized it wasn't going to offer what I needed). My needs are rather specific, so I decided to code one myself in Swift. It's not that difficult, and I had never coded in Swift up until writing my own plugin. This gives you a whole lot more freedom in how you implement the native APIs (and there are many for audio). I tried to alter some of the existing plugins for iOS, but they all use Objective-C, which kind of feels like reading a book written in the 1800s when you're used to languages like Dart/JavaScript, so I used Swift. This also allows you to have a tighter coupling between your Dart code and the native APIs. Hope this helps. Thanks to everyone who has been working on audio plugins. Cheers!

LoSunny commented 5 years ago

I don't know how to write a Flutter plugin. Can you send some links about getting started with writing a Flutter plugin? Also, could you share your background audio plugin?

adriancmurray commented 5 years ago

It's pretty straightforward, and the way to do it is in the documentation: https://flutter.dev/docs/development/platform-integration/platform-channels

As for my plugin, it's really tightly coupled to my own Dart code. It skips over some functionality that some people might need and focuses on my app's needs, meaning it's not designed as a one-size-fits-all service. I'm managing my playlist in ways that most people won't want. It also skips over Android, as I just don't want to write in / learn another language right now.

That being said, I can give some pointers on how to make your own audio plugin. You might want to use the Audioplayer plugin as a scaffold for your Dart code, or at least as a way to see which initial functionality you'd like to implement. If you're going to write the iOS part in Swift, I suggest looking at the code in the multiimagepicker or vibrate plugins, as they were written in Swift and you can see how they send callbacks back from Swift.

For iOS, I'd use the AVQueuePlayer or AVPlayer class for playing the audio, and you will also need MPNowPlayingInfoCenter to get information to the Control Center (sorry if these aren't the exact names, I'm on my phone and going off memory). There are several commands you can support from the Control Center, such as skipping 15 seconds, skip, fast forward, rewind, bookmark, like, etc.

Make sure your channel is available to your entire Swift class, because you'll need it to send callbacks with the current position and currently playing track. The channel comes in as an argument when your plugin instance is created by the plugin framework; then you can use "invokeMethod" whenever you want. To get the current position, use a periodic timer that checks the position and sends it back to Dart with the callback I mentioned. I hope all this helps. Cheers!
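For illustration, the recipe above (a method channel kept on the Swift class, AVPlayer for playback, and a periodic time observer pushing the position back to Dart) could be sketched roughly as follows. This is a hedged sketch, not any real plugin's code: the class name `SwiftAudioPlugin`, the channel name `audio_plugin_example`, and the method names (`play`, `pause`, `onPosition`) are all hypothetical placeholders.

```swift
import Flutter
import AVFoundation

// Hypothetical sketch of the approach described above; names are placeholders.
public class SwiftAudioPlugin: NSObject, FlutterPlugin {
    private let channel: FlutterMethodChannel
    private let player = AVPlayer()
    private var timeObserver: Any?

    public static func register(with registrar: FlutterPluginRegistrar) {
        let channel = FlutterMethodChannel(name: "audio_plugin_example",
                                           binaryMessenger: registrar.messenger())
        let instance = SwiftAudioPlugin(channel: channel)
        registrar.addMethodCallDelegate(instance, channel: channel)
    }

    // Keep a reference to the channel so native code can call back into Dart.
    init(channel: FlutterMethodChannel) {
        self.channel = channel
        super.init()
        // Periodically push the current position back to Dart, as suggested above.
        let interval = CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
        timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                                      queue: .main) { [weak self] time in
            self?.channel.invokeMethod("onPosition", arguments: CMTimeGetSeconds(time))
        }
    }

    public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
        switch call.method {
        case "play":
            if let urlString = call.arguments as? String, let url = URL(string: urlString) {
                player.replaceCurrentItem(with: AVPlayerItem(url: url))
                player.play()
            }
            result(nil)
        case "pause":
            player.pause()
            result(nil)
        default:
            result(FlutterMethodNotImplemented)
        }
    }
}
```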

ryanheise commented 5 years ago

A question for iOS developers: would it in fact be possible to implement the iOS side of this plugin without using the Flutter background execution API?

On Android, we must use the background execution API to create a Dart isolate that lives within a Service component, separate from the default Dart isolate that lives within the Activity component, because Android allows the Activity (UI) to be stopped, and even destroyed and cleared from memory, while the background Service lives on playing audio.

My understanding is that in iOS, an application isn't required to be segmented into independent foreground and background components like this, and that the background execution API would only be needed for cases where an external event outside the app needs to trigger Dart code in the app to run. So my hope is that we could simplify our efforts for the iOS side by just implementing it using standard techniques that most iOS Flutter plugin developers will already be familiar with. If this is possible, then "theoretically" we wouldn't even need to spawn a separate isolate for the background task; however,* I think it would still be a good idea to spawn one, just to prevent users of this plugin from relying on shared memory on iOS only to find that their code does not work on Android.

(*) Perhaps one potential use case for audio_service could be if we want the play/pause button to start up the app if it's not already running and begin playing audio. audio_service doesn't support this use case yet anyway, and I also don't know if this is a capability of iOS (it is a capability of Android), but it sounds like the sort of thing that could require the background execution API.

adriancmurray commented 5 years ago

From what I’ve gathered in making my own plugin is that no, you don’t need to do flutter background execution at all. Mine doesn’t. The iOS code manages all of that for you with the AVPlayer class. The dart code then reacts to what your audio is doing on the device. You could do it all with background execution and then simply call the MPNowPlayingInfo and which file is playing in the AVPlayer. That could make it so your playlist is actually handled completely in dart giving you more common ground for the android side. I chose not to go this way because in the geofencing plugin they mention that there is no guarantee that the isolate will remain in the background. That, and figuring out background execution in swift seemed like a bit more than I wanted to chew.

ryanheise commented 5 years ago

Thanks for the informative answer, @adriancmurray . I am certainly looking at this from the perspective of how audio_service works, so the goal would be to let users of this plugin write all of their own audio logic in Dart so that they are in complete control of whether they want to play music, play text to speech, or even play synthesized audio. The audio_service plugin itself stays out of the business of actually playing the audio but will provide "everything else".

Perhaps then, in the iOS implementation, we would not need to interface with AVPlayer, but we would need to interface with MPRemoteCommandCenter. Given these goals, do we still see any need for the background execution API on iOS?
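To make the division of labour concrete, wiring the remote command center through to Dart (with the actual audio logic staying on the Dart side) might look roughly like this. This is a hedged sketch under assumptions: the Dart-facing method names (`onPlay`, `onPause`, `onSkipToNext`) are illustrative, not audio_service's actual API.

```swift
import Flutter
import MediaPlayer

// Sketch: forward lock-screen / Control Center button presses to Dart
// over an existing method channel, so the plugin never touches AVPlayer.
func registerRemoteCommands(channel: FlutterMethodChannel) {
    let center = MPRemoteCommandCenter.shared()
    center.playCommand.addTarget { _ in
        channel.invokeMethod("onPlay", arguments: nil)
        return .success
    }
    center.pauseCommand.addTarget { _ in
        channel.invokeMethod("onPause", arguments: nil)
        return .success
    }
    center.nextTrackCommand.addTarget { _ in
        channel.invokeMethod("onSkipToNext", arguments: nil)
        return .success
    }
}
```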

Dekkee commented 5 years ago

Your idea of splitting the service abstraction and the player abstraction sounds good. In my case MediaPlayer is not enough and I have to use ExoPlayer, and the service abstraction works quite well there. On iOS, though, background execution is enabled via a checkbox in Xcode, and then you just operate on some static objects.

I'm not a mobile developer at all; I come from web dev. But I contributed an iOS implementation to https://github.com/thyagoluciano/flutter_radio. It has much less functionality than audio_service, but I investigated some cases like background audio. Note that you still need to update the command center (lock screen) manually: https://github.com/thyagoluciano/flutter_radio/blob/ff2bd1b85ae1064f4582ec1c1a93f5af3fcf6f11/ios/Classes/FlutterRadioPlugin.m#L312

I'm struggling with the Objective-C syntax. Impressed by https://gitlab.com/exitlive/music-player, I want to refactor all my iOS code to Swift. That would also be a chance to split the implementation of the service and the player.

konstantin-mohin commented 5 years ago

@adriancmurray @adriancmurray @alexelisenko @hacker1024 @ryanheise

Thank you for contributing to this task.

I just want to know: is someone currently working on it?

ryanheise commented 5 years ago

It would be great to get the ball rolling, even in a small way, but it will take someone with sufficient motivation and sufficient time to overcome the initial barrier to entry.

Supposing that we had a basic foundation in place with a minimal amount of iOS code required to get Stage 0 working, I think then it would be a lot easier for other iOS contributors to contribute a feature or two here and there, rather than one person doing it all.

So in terms of overcoming the initial barrier to entry, would any iOS developer be interested in working with me to get Stage 0 in place?

Dekkee commented 5 years ago

I'm interested in implementing Stage 0, but I'm not a mobile developer and will be super busy for several months. Currently I can only help by sharing my iOS implementation of background audio in another plugin: https://github.com/thyagoluciano/flutter_radio/blob/ff2bd1b85ae1064f4582ec1c1a93f5af3fcf6f11/ios/Classes/FlutterRadioPlugin.m#L312

ryanheise commented 5 years ago

Thanks for that. Yes, I think there's now probably enough code shared by you and others above to be really helpful to someone implementing the iOS side of the plugin.

But understandably we all have other priorities at the moment.

While I personally also have other priorities at the moment, and to make matters worse, I don't have a Mac or an iPhone, I do at least now have MacInCloud which brings me one step closer to being able to work on this.

konstantin-mohin commented 5 years ago

I have two iPhones of different models so I could test

hacker1024 commented 5 years ago

While I personally also have other priorities at the moment, and to make matters worse, I don't have a Mac or an iPhone, I do at least now have MacInCloud which brings me one step closer to being able to work on this.

I don't either, but I've built a Hackintosh which can run the iOS Simulator. It's fairly straightforward. If you have a Linux computer, there's an easy way to create a VM: foxlet/macOS-Simple-KVM.

I've also managed to put together a fully functional iPhone SE with parts I found at my local e-waste recycling bin - I recommend looking in one if you can.

Anyway, I'd be very happy to test things on iOS, since I really want to get my Pandora client working on iPhones.

hacker1024 commented 5 years ago

Now that I've got a working Hackintosh, I'm going to learn iOS development and try and implement the iOS side.

ryanheise commented 5 years ago

@hacker1024 , cool! You might also want to check with @alexelisenko as he might have also at least done some initial sketching of an iOS plan. The key points are that the iOS implementation should not need to do any Flutter background execution. However, to maintain compatibility with the Android side which does need this, the iOS side can just directly use the Isolates API to start the background task within a new isolate. The plugin's "start" method can use the various iOS APIs to initialise things for audio, such as activating the AVAudioSession and setting callbacks on the MPRemoteCommandCenter.

We should eventually document the separation of concerns between this plugin and other audio plugins that can interoperate with it: this plugin is responsible for setting up the AVAudioSession on iOS and the MediaSession on Android, and other plugins that just play audio should not set these up themselves. Happily, this seems to be the current situation with the other audio plugins I've checked, but it could help to make clear where the boundary between responsibilities lies.
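As a rough sketch of that session-setup responsibility, the plugin's "start" path on iOS might configure and activate the shared audio session along these lines. The category and mode chosen here are assumptions for illustration; a real implementation would presumably make them configurable.

```swift
import AVFoundation

// Hedged sketch: the audio_service side (not the audio-playing plugin)
// would own configuring and activating the shared AVAudioSession.
func activateAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default)
    try session.setActive(true)
}
```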

ryanheise commented 5 years ago

@hacker1024 (and @alexelisenko ?) I now also am at the stage where I need iOS support, and I would be happy to team up with you on this. I don't have full-time access to a Mac, but I should have sporadic access to one about 3 times a week, which is far better than nothing, and urgency will help push me to make something happen.

(I do have MacInCloud, but the affordable plan I'm on doesn't give me root access, which is needed to get around a Flutter bug that prevents me from seeing the Flutter logs and doing hot reload, rendering it useless for serious development.)

In case you haven't done any work on this yet, I'd be happy to start by laying some structural foundations, setting up an isolate for the iOS side. If you'd like to collaborate, I'll set up a Trello board with a plan.

alexandergottlieb commented 5 years ago

@ryanheise I'd also be happy to help out on the iOS side if you need more hands/an extra mac. I have limited experience developing for iOS but keen to learn.

ryanheise commented 5 years ago

Thanks, @alexandergottlieb , for the offer to help! I'll try my hand at implementing Stage 0 over the coming days, and hopefully build a base that others such as yourself can then contribute on top of.

yringler commented 4 years ago

That's awesome! This requirement has come up recently for a project I'm working on, and in the next week or so I should also be available to work on this. I have a Mac, but no iOS knowledge (yet).

ryanheise commented 4 years ago

Quick update: I've been hacking away at the iOS side for about a day and a half, and have got the isolates working and am able to connect and start audio playback. I'm going to need some help with the trickier Objective-C code, so I plan to push a new branch with what I've done so far and see if some of us can work together.

ryanheise commented 4 years ago

:gift: :gift: :gift:

I've just committed an initial iOS implementation to the iosdev branch!

If you think you can help out, please take a look at the TODO comments I've scattered throughout for an idea of what needs to be done. Now that there's actually a place to put the pieces of code, and given the many comments and links above to the pieces of code we need, hopefully this should actually be relatively straightforward.

If you'd like to tackle any of the TODO items, I'd suggest either commenting here or opening a new issue to announce your intention to work on it so that others know not to work on it, and then submit a pull request to the iosdev branch. I'll merge this onto master once enough of the core features are working.

alexandergottlieb commented 4 years ago

Thanks! I've opened a pull request for initialising AVAudioSession and MPRemoteCommandCenter. Just muddling through docs and tutorials so I hope it's along the right lines!

ryanheise commented 4 years ago

Great, thanks for the contribution! I've left some comments on the pull request. One of the links from above which should be helpful in relation to using these APIs is Alexander's project: https://github.com/alexelisenko/audio_player_service .