ryanheise / audio_service

Flutter plugin to play audio in the background while the screen is off.

Problem in background after updating phone to iOS 16.2 #990

Closed csacchetti closed 1 year ago

csacchetti commented 1 year ago

Documented behaviour

There are no special APIs

Actual behaviour

After updating my iPhone to iOS 16.2 I encountered a problem that I don't have with another iPhone running iOS 14. Songs that used to keep playing in the background now only play in the foreground. As the player I use just_audio. This problem does not occur on Android. I don't see a programming problem on my side, otherwise it would also occur on the iPhone running iOS 14, whereas on that phone everything works correctly, as before. In the console I have no error message apart from the one it has always printed, which has never been a problem:

[NowPlaying] [MRNowPlaying] Ignoring setPlaybackState because application does not contain entitlement com.apple.mediaremote.set-playback-state for platform

I would say the following message has been added, but I don't remember whether it was there before:

[Entitlements] MSVEntitlementUtilities - Process Runner PID[572] - Group: (null) - Entitlement: com.apple.mediaremote.external-artwork-validation - Entitled: NO - Error: (null)

With another, simpler app on iOS 16.2 I have the same problem, and I also noticed an error message that may help track down the cause:

[BackgroundTask] Background Task 9 ("Flutter debug task"), was created over 30 seconds ago. In applications running in the background, this creates a risk of termination. Remember to call UIApplication.endBackgroundTask(_:) for your task in a timely manner to avoid this.

So the audio plays perfectly in the foreground, but if I put the app in the background, the audio stops when changing audio files. This problem wasn't there before, and the fact that everything works perfectly on iOS 14 suggests a problem introduced by the iOS 16.2 update.


Minimal reproduction project

Official example: main.dart

Reproduction steps

The track is loaded automatically from a list of tracks played in sequence.

Output of flutter doctor

[✓] Flutter (Channel stable, 3.3.10, on macOS 12.6.2 21G320 darwin-x64)
    • Flutter version 3.3.10 on channel stable at /Users/carlosacchetti/Developer/flutter
    • Upstream repository https://github.com/flutter/flutter.git
    • Framework revision 135454af32 (5 weeks ago), 2022-12-15 07:36:55 -0800
    • Engine revision 3316dd8728
    • Dart version 2.18.6
    • DevTools version 2.15.0

[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
    • Android SDK at /Users/carlosacchetti/Library/Android/sdk
    • Platform android-33, build-tools 33.0.0
    • Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
    • Java version OpenJDK Runtime Environment (build 11.0.10+0-b96-7281165)
    • All Android licenses accepted.

[✓] Xcode - develop for iOS and macOS (Xcode 14.2)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Build 14C18
    • CocoaPods version 1.11.2

[✓] Chrome - develop for the web
    • Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome

[✓] Android Studio (version 2020.3)
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart
    • Java version OpenJDK Runtime Environment (build 11.0.10+0-b96-7281165)

[✓] IntelliJ IDEA Community Edition (version 2021.1.1)
    • IntelliJ at /Applications/IntelliJ IDEA CE.app
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart

[✓] VS Code (version 1.74.3)
    • VS Code at /Applications/Visual Studio Code.app/Contents
    • Flutter extension version 3.56.0

[✓] Connected device (4 available)
    • SM G780G (mobile)  • RF8T30H0K2Y               • android-arm64  • Android 13 (API 33)
    • iPhone CS (mobile) • 00008110-000A5CD42682801E • ios            • iOS 16.2 20C65
    • macOS (desktop)    • macos                     • darwin-x64     • macOS 12.6.2 21G320 darwin-x64
    • Chrome (web)       • chrome                    • web-javascript • Google Chrome 108.0.5359.124
    ! Error: iPhone CS is busy: Fetching debug symbols for iPhone CS. Xcode will continue when iPhone CS is finished. (code -10)
    ! Error: Apple Watch di Carlo needs to connect to determine its availability. Check the connection between the device and its
      companion iPhone, and the connection between the iPhone and Xcode. Both devices may also need to be restarted and unlocked.
      (code 1)

[✓] HTTP Host Availability
    • All required HTTP hosts are available

• No issues found!

Devices exhibiting the bug

iPhone 13 and iOS 16.2

ryanheise commented 1 year ago

Thank you for the report, but can you please fill in the sections correctly for the minimal reproduction project and the reproduction steps?

csacchetti commented 1 year ago

After further tests I think the problem is due to a conflict between Firebase push notifications and the audio_service isolate. The problem was not in my use of audio_service, because otherwise it would also have failed on the phone with iOS 14, so I focused on possible conflicts. The only thing I had changed was an operation on the push notification topics, which I had moved into the initial main(). This was probably creating a conflict between firebase_messaging and the audio_service isolate, so I reverted that change, and the app started working normally again. The fact that it kept working on iOS 14 may be related to the phone with iOS 14 being slower, so that the two processes, with different timing, did not conflict. So iOS 16.2 was not the cause. This is what I did and I hope it will be useful to other people.

I also added the "Background processing" mode in Xcode to avoid an error message, but I don't think this is related to the problem, because audio_service only requires that Background Audio be enabled.
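
For reference, here is a rough sketch of the initialisation order that worked for me. The Firebase calls and the topic name are only illustrative assumptions (I am using firebase_core and firebase_messaging), and AudioPlayerHandler and MyApp are the classes from the official example further below; the point is simply that the topic subscription no longer runs before AudioService.init().

```dart
import 'package:audio_service/audio_service.dart';
import 'package:firebase_core/firebase_core.dart';
import 'package:firebase_messaging/firebase_messaging.dart';
import 'package:flutter/material.dart';

late AudioHandler _audioHandler;

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await Firebase.initializeApp();

  // Set up the audio handler first, before any other plugin work.
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.ryanheise.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
    ),
  );

  runApp(const MyApp());

  // Subscribe to the (hypothetical) topic only after the app is running,
  // rather than before AudioService.init().
  await FirebaseMessaging.instance.subscribeToTopic('example_topic');
}
```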

ryanheise commented 1 year ago

Glad you sorted out the issue in your app. I'll close this for now, and if you later find something pointing to a bug in audio_service itself, please comment below or post a new issue.

csacchetti commented 1 year ago

Unfortunately, the problem has returned. I wanted to check whether it could be specific to the iPhone 13, since everything was working properly on the iPhone 12 with iOS 14. To do this I took an old iPhone X that had iOS 14 installed, installed the app, and everything worked perfectly. Then I upgraded the iPhone X to iOS 16.2 and the problem returned. So I think it's safe to say that iOS 16.2 introduced changes that limit background audio. I looked at the various support sites and found this article: https://developer.apple.com/forums/thread/717701 I could use some help, @ryanheise, because I really don't know where to start. This is blocking the release of my app, which is now almost complete.

csacchetti commented 1 year ago

One more item, in case it helps. Looking at the console when I load the app on iOS 14 or iOS 16.2, the only difference I notice is this message at the end of the Firebase setup. I looked through the support forums and found nothing; maybe it means something to you, @ryanheise:

[TraitCollection] Class CKBrowserSwitcherViewController overrides the -traitCollection getter, which is not supported. If you're trying to override traits, you must use the appropriate API.

ryanheise commented 1 year ago

I have reopened the issue, but I return to my original comment: can you edit your issue to fill in the sections correctly? You'll either need to investigate and solve the problem yourself, or make it possible for other people to carry on the investigation based on the contents of your bug report (which is currently not possible).

csacchetti commented 1 year ago

Here is the example demonstrating the bug on iOS 16.2 (perhaps also on 16.0, but I have not tested that).

First of all, download the example provided with the audio_service plugin from GitHub. The code below is the example's main.dart; I have only added the final part, where I explain how to reproduce the bug, and a call to _myListenVoiceState() in the constructor of AudioPlayerHandler().

Of course, the quickest way is to copy and paste the complete file I have included here.

```dart
// ignore_for_file: public_member_api_docs

// FOR MORE EXAMPLES, VISIT THE GITHUB REPOSITORY AT:
//
//  https://github.com/ryanheise/audio_service
//
// This example implements a minimal audio handler that renders the current
// media item and playback state to the system notification and responds to 4
// media actions:
//
// - play
// - pause
// - seek
// - stop
//
// To run this example, use:
//
// flutter run

import 'dart:async';

import 'package:audio_service/audio_service.dart';
import 'package:audio_service_example/common.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:just_audio/just_audio.dart';
import 'package:rxdart/rxdart.dart';

// You might want to provide this using dependency injection rather than a
// global variable.
late AudioHandler _audioHandler;

Future<void> main() async {
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.ryanheise.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Audio Service Demo',
      theme: ThemeData(primarySwatch: Colors.blue),
      home: const MainScreen(),
    );
  }
}

class MainScreen extends StatelessWidget {
  const MainScreen({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Audio Service Demo'),
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            // Show media item title
            StreamBuilder<MediaItem?>(
              stream: _audioHandler.mediaItem,
              builder: (context, snapshot) {
                final mediaItem = snapshot.data;
                return Text(mediaItem?.title ?? '');
              },
            ),
            // Play/pause/stop buttons.
            StreamBuilder<bool>(
              stream: _audioHandler.playbackState
                  .map((state) => state.playing)
                  .distinct(),
              builder: (context, snapshot) {
                final playing = snapshot.data ?? false;
                return Row(
                  mainAxisAlignment: MainAxisAlignment.center,
                  children: [
                    _button(Icons.fast_rewind, _audioHandler.rewind),
                    if (playing)
                      _button(Icons.pause, _audioHandler.pause)
                    else
                      _button(Icons.play_arrow, _audioHandler.play),
                    _button(Icons.stop, _audioHandler.stop),
                    _button(Icons.fast_forward, _audioHandler.fastForward),
                  ],
                );
              },
            ),
            // A seek bar.
            StreamBuilder<MediaState>(
              stream: _mediaStateStream,
              builder: (context, snapshot) {
                final mediaState = snapshot.data;
                return SeekBar(
                  duration: mediaState?.mediaItem?.duration ?? Duration.zero,
                  position: mediaState?.position ?? Duration.zero,
                  onChangeEnd: (newPosition) {
                    _audioHandler.seek(newPosition);
                  },
                );
              },
            ),
            // Display the processing state.
            StreamBuilder<AudioProcessingState>(
              stream: _audioHandler.playbackState
                  .map((state) => state.processingState)
                  .distinct(),
              builder: (context, snapshot) {
                final processingState =
                    snapshot.data ?? AudioProcessingState.idle;
                return Text(
                    "Processing state: ${describeEnum(processingState)}");
              },
            ),
          ],
        ),
      ),
    );
  }

  /// A stream reporting the combined state of the current media item and its
  /// current position.
  Stream<MediaState> get _mediaStateStream =>
      Rx.combineLatest2<MediaItem?, Duration, MediaState>(
          _audioHandler.mediaItem,
          AudioService.position,
          (mediaItem, position) => MediaState(mediaItem, position));

  IconButton _button(IconData iconData, VoidCallback onPressed) => IconButton(
        icon: Icon(iconData),
        iconSize: 64.0,
        onPressed: onPressed,
      );
}

class MediaState {
  final MediaItem? mediaItem;
  final Duration position;

  MediaState(this.mediaItem, this.position);
}

/// An [AudioHandler] for playing a single item.
class AudioPlayerHandler extends BaseAudioHandler with SeekHandler {
  static final _item = MediaItem(
    id: 'https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3',
    album: "Science Friday",
    title: "A Salute To Head-Scratching Science",
    artist: "Science Friday and WNYC Studios",
    duration: const Duration(milliseconds: 5739820),
    artUri: Uri.parse(
        'https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg'),
  );

  final _player = AudioPlayer();

  /// Initialise our audio handler.
  AudioPlayerHandler() {
    // Here I call the function I will use to demonstrate the bug.
    _myListenVoiceState();

    // So that our clients (the Flutter UI and the system notification) know
    // what state to display, here we set up our audio handler to broadcast all
    // playback state changes as they happen via playbackState...
    _player.playbackEventStream.map(_transformEvent).pipe(playbackState);
    // ... and also the current media item via mediaItem.
    mediaItem.add(_item);

    // Load the player.
    _player.setAudioSource(AudioSource.uri(Uri.parse(_item.id)));
  }

  // In this simple example, we handle only 4 actions: play, pause, seek and
  // stop. Any button press from the Flutter UI, notification, lock screen or
  // headset will be routed through to these 4 methods so that you can handle
  // your audio playback logic in one place.

  @override
  Future<void> play() => _player.play();

  @override
  Future<void> pause() => _player.pause();

  @override
  Future<void> seek(Duration position) => _player.seek(position);

  @override
  Future<void> stop() => _player.stop();

  /// Transform a just_audio event into an audio_service state.
  ///
  /// This method is used from the constructor. Every event received from the
  /// just_audio player will be transformed into an audio_service state so that
  /// it can be broadcast to audio_service clients.
  PlaybackState _transformEvent(PlaybackEvent event) {
    return PlaybackState(
      controls: [
        MediaControl.rewind,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.stop,
        MediaControl.fastForward,
      ],
      systemActions: const {
        MediaAction.seek,
        MediaAction.seekForward,
        MediaAction.seekBackward,
      },
      androidCompactActionIndices: const [0, 1, 3],
      processingState: const {
        ProcessingState.idle: AudioProcessingState.idle,
        ProcessingState.loading: AudioProcessingState.loading,
        ProcessingState.buffering: AudioProcessingState.buffering,
        ProcessingState.ready: AudioProcessingState.ready,
        ProcessingState.completed: AudioProcessingState.completed,
      }[_player.processingState]!,
      playing: _player.playing,
      updatePosition: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
      queueIndex: event.currentIndex,
    );
  }

  //THIS IS THE CODE I ADDED TO DEMONSTRATE THE BUG THAT OCCURS WITH IOS 16.2
  // (I AM NOT SURE IF IT STARTS WITH 16.0 BUT I NOTICED IT FROM 16.2)

  // To check this, seek to near the end of the song, go back 10 seconds with
  // the rewind button, and then start playback. In the foreground the song
  // will restart normally once it reaches the end. If you start playback and
  // then put the app in the background, on iOS 14 it will still restart
  // correctly, whereas on iOS 16.2 it will freeze.

  bool _onlyOne = true;

  void sequenceLogic() {
    MediaItem myItem = MediaItem(
      id: 'https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3',
      album: "Science Saturday",
      title: "A Salute To Head-Scratching Science",
      artist: "Science Friday and WNYC Studios",
      duration: const Duration(milliseconds: 5739820),
      artUri: Uri.parse(
          'https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg'),
    );
    mediaItem.add(myItem);
    _player.setAudioSource(AudioSource.uri(Uri.parse(myItem.id)));
    // Start playback of the newly loaded item.
    _audioHandler.play();
    _onlyOne = true;
  }

  void _myListenVoiceState() {
    // Listen to errors during playback.
    _player.playerStateStream.listen((event) async {
      switch (event.processingState) {
        case ProcessingState.completed:
          if (_onlyOne) {
            _onlyOne = false;
            sequenceLogic();
          }
          break;
        case ProcessingState.idle:
          break;
        case ProcessingState.loading:
          break;
        case ProcessingState.ready:
          break;
        case ProcessingState.buffering:
          break;
      }
    });
  }
}
```

ryanheise commented 1 year ago

I'm re-closing this. Rather than explain what the instructions asked you to do, please open a new issue, and that way you will be able to see the instructions and know what to do.

rjgpereira commented 1 year ago

@csacchetti I had a similar issue. I'm mimicking the playlist behaviour but not using ConcatenatingAudioSource: I switch tracks manually by calling player.setAudioSource() with the proper index, and listen for the 'completed' playbackState in order to skip to the next index. Since I was using the player's LoopMode.off, after the 'completed' state an 'idle' state was emitted as well. I think broadcasting this 'idle' state on iOS 16 might close the audio session. I don't know if this is exactly what's happening, and I can't explain why it didn't occur on previous iOS versions, but this fixed it for me.

Try broadcasting AudioProcessingState.ready when you get a ProcessingState.idle, so that the OS keeps your session alive while you switch tracks:

```dart
processingState: {
  ProcessingState.idle: Platform.isIOS
      ? AudioProcessingState.ready
      : AudioProcessingState.idle,
  ProcessingState.loading: AudioProcessingState.loading,
  ProcessingState.buffering: AudioProcessingState.buffering,
  ProcessingState.ready: AudioProcessingState.ready,
  ProcessingState.completed: AudioProcessingState.completed,
}[_player.processingState]!,
```
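
For context, this is roughly how that mapping slots into the _transformEvent method of the official example above. It is only a sketch, and Platform needs an extra import of dart:io at the top of the file.

```dart
// Requires: import 'dart:io' show Platform;  at the top of main.dart.
PlaybackState _transformEvent(PlaybackEvent event) {
  return PlaybackState(
    controls: [
      MediaControl.rewind,
      if (_player.playing) MediaControl.pause else MediaControl.play,
      MediaControl.stop,
      MediaControl.fastForward,
    ],
    systemActions: const {
      MediaAction.seek,
      MediaAction.seekForward,
      MediaAction.seekBackward,
    },
    androidCompactActionIndices: const [0, 1, 3],
    // On iOS, report `ready` instead of `idle` so the audio session is not
    // torn down while switching tracks.
    processingState: {
      ProcessingState.idle: Platform.isIOS
          ? AudioProcessingState.ready
          : AudioProcessingState.idle,
      ProcessingState.loading: AudioProcessingState.loading,
      ProcessingState.buffering: AudioProcessingState.buffering,
      ProcessingState.ready: AudioProcessingState.ready,
      ProcessingState.completed: AudioProcessingState.completed,
    }[_player.processingState]!,
    playing: _player.playing,
    updatePosition: _player.position,
    bufferedPosition: _player.bufferedPosition,
    speed: _player.speed,
    queueIndex: event.currentIndex,
  );
}
```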

ryanheise commented 1 year ago

@rjgpereira thank you for your input, but maybe you didn't see my comment just above which was that I have now closed this issue in favour of opening a new issue, which is where new comments can most helpfully be shared.

rjgpereira commented 1 year ago

oh sorry about that. I did miss it

csacchetti commented 1 year ago

Thank you, your solution works for me too. For the time being, until the problem is solved in the plugin, it is a good workaround. I hope @ryanheise can look into the problem and make a fix so that the plugin is fully compatible with iOS 16 as well.

I recommend, as @ryanheise asked, posting this fix in the new issue I created on his advice, so it can be read by more people and help others. Here's the link: https://github.com/ryanheise/audio_service/issues/993

zoozobib commented 1 year ago

> (quoting @rjgpereira's workaround from the comment above)

Thanks, your help resolved my current issue. I had the same problem in my project on iOS 16.2; I had just finished rebuilding my code around ConcatenatingAudioSource a few minutes ago, and it is working normally now. Thank you very much @rjgpereira.
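
For anyone who instead takes the ConcatenatingAudioSource route mentioned above, a minimal just_audio sketch (with placeholder URLs, not taken from this thread) could look like this; the playlist advances inside a single audio session, so the player does not pass through idle between tracks:

```dart
import 'package:just_audio/just_audio.dart';

Future<void> setUpPlaylist(AudioPlayer player) async {
  // Placeholder URLs; substitute your own track list.
  final playlist = ConcatenatingAudioSource(children: [
    AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
    AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
  ]);
  // just_audio advances through the children automatically, so there is no
  // need to listen for `completed` and swap sources manually.
  await player.setAudioSource(playlist);
  await player.play();
}
```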

ryanheise commented 1 year ago

It is strange how people will post a new comment as if they didn't even read the comment immediately preceding it. Please don't do that, since the immediately preceding comment was asking you to please not reply here.