bluefireteam / audioplayers

A Flutter package to play multiple audio files simultaneously (Android/iOS/web/Linux/Windows/macOS)
https://pub.dartlang.org/packages/audioplayers
MIT License

[Android 7 to 10] Paused audio of playerA resumes automatically after audio of playerB completes playing. #1329

Open · JohnnyRainbow81 opened this issue 1 year ago

JohnnyRainbow81 commented 1 year ago

Current behaviour: I have two audio players with two different sounds, playerA and playerB. When I play playerA's sound, pause it, then play playerB's sound and let it complete, playerA's sound automatically resumes and plays to the end.

Expected behaviour: When playerA is paused, it should remain paused after playerB completes playing.

[Edit] So far I can only say that the bug is Android-exclusive! Besides my old Android 7 device, I just tested the code below on a newer Android 10 phone and the bug still exists.

On my iPhone 8 with iOS 16 I get the expected behavior; everything works fine there. Here is a minimal code example in which I roughly replicated my architecture. My main codebase also has a ViewModel, but I left it out for simplicity.

Here's the repo: https://github.com/JohnnyRainbow81/audio_players_test.git

import 'dart:async';
import 'package:audioplayers/audioplayers.dart';
import 'package:flutter/material.dart';
import 'package:flutter_cache_manager/flutter_cache_manager.dart';

void initializeOSDependentAudio() {
  final AudioContext audioContext = AudioContext(
    iOS: AudioContextIOS(
      defaultToSpeaker: true,
      category: AVAudioSessionCategory.ambient,
      options: [AVAudioSessionOptions.defaultToSpeaker, AVAudioSessionOptions.duckOthers],
    ),
    android: AudioContextAndroid(
      isSpeakerphoneOn: true,
      stayAwake: true,
      contentType: AndroidContentType.speech,
      usageType: AndroidUsageType.media,
      audioFocus: AndroidAudioFocus.gainTransientExclusive,
    ),
  );

  AudioPlayer.global.setGlobalAudioContext(audioContext);

  AudioPlayer.global.changeLogLevel(LogLevel.info);
}
void main() {
  WidgetsFlutterBinding.ensureInitialized();
  initializeOSDependentAudio();

  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: const AudioApp(),
    );
  }
}

class AudioApp extends StatefulWidget {
  const AudioApp({super.key});

  @override
  State<AudioApp> createState() => _AudioAppState();
}

class _AudioAppState extends State<AudioApp> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
        body: Center(
            child: Column(
      mainAxisAlignment: MainAxisAlignment.center,
      children: const [
        PlayButton("613be3bf-bbc1-4ab9-911a-c1758a50c6d8"),
        PlayButton("cce9519c-5b68-46d2-a9de-12d21dac274c")
      ],
    )));
  }
}

class PlayButton extends StatefulWidget {
  final String id;

  const PlayButton(this.id, {super.key});

  @override
  State<PlayButton> createState() => _PlayButtonState();
}

class _PlayButtonState extends State<PlayButton> {
  PlayerState? state;
  StreamSubscription<PlayerState>? subscription;
  late AudioData audioData;

  @override
  void initState() {
    super.initState();
    init();
  }

  Future<void> init() async {
    audioData = await AudioService.instance.prepareAudio(widget.id);
    subscription = audioData.stateStream?.listen((event) {
      state = event;
      setState(() {});
    });
  }

  @override
  void dispose() {
    subscription?.cancel();
    super.dispose();
  }

  void togglePlay() {
    switch (state) {
      case PlayerState.stopped:
      case PlayerState.completed:
      case PlayerState.paused:
        AudioService.instance.play(widget.id);
        break;
      case PlayerState.playing:
        AudioService.instance.pause(widget.id);
        break;
      default:
        AudioService.instance.play(widget.id);
    }

    //  setState(() {});
  }

  IconData _getIcon() {
    switch (state) {
      case PlayerState.stopped:
      case PlayerState.completed:
      case PlayerState.paused:
        return Icons.play_arrow;
      case PlayerState.playing:
        return Icons.pause;
      default:
        return Icons.play_arrow;
    }
  }

  @override
  Widget build(BuildContext context) {
    return IconButton(onPressed: togglePlay, icon: Icon(_getIcon()));
  }
}

class AudioService {
  AudioService._();
  static final AudioService _instance = AudioService._();

  static AudioService get instance => _instance;
  final String baseUrl = "https://speakyfox-api-qa.herokuapp.com/api/v1/files";

  final Map<String, AudioData> _audioDatas = {};
  final Map<String, AudioPlayer> _audioPlayers = {};

  Future<AudioData> prepareAudio(String id) async {
    late AudioPlayer audioPlayer;

    if (!_audioPlayers.containsKey(id)) {
      audioPlayer = AudioPlayer(playerId: id);
      audioPlayer.setReleaseMode(ReleaseMode.release);
      audioPlayer.setPlayerMode(PlayerMode.mediaPlayer);
      audioPlayer.setVolume(1);
    } else {
      // Reuse the existing player; otherwise the `late` variable would stay
      // uninitialized on a second prepareAudio() call for the same id.
      audioPlayer = _audioPlayers[id]!;
    }

    String path = (await DefaultCacheManager().getSingleFile(_url(id))).path;
    await audioPlayer.setSourceDeviceFile(path);

    AudioData audioData = AudioData.empty();
    audioData.audioId = id;
    audioData.stateStream = audioPlayer.onPlayerStateChanged;

    _audioPlayers.putIfAbsent(id, () => audioPlayer);
    _audioDatas.putIfAbsent(id, () => audioData);

    return audioData;
  }

  Future<void> play(String id) async {
    String path = (await DefaultCacheManager().getSingleFile(_url(id))).path;
    await _audioPlayers[id]?.play(DeviceFileSource(path));
  }

  Future<void> pause(String id) async {
    await _audioPlayers[id]?.pause();
  }

  Future<void> resume(String id) async {
    await _audioPlayers[id]?.resume();
  }

  String _url(String id) {
    return "$baseUrl/$id";
  }
}

class AudioData {
  String audioId = "";
  Stream<PlayerState>? stateStream;

  AudioData({
    this.stateStream,
  });

  AudioData.empty();
}

[✓] Flutter (Channel stable, 3.3.8, on macOS 12.2.1 21D62 darwin-arm, locale de-DE)
[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
[✓] Xcode - develop for iOS and macOS (Xcode 13.4.1)
[✓] Chrome - develop for the web
[✓] Android Studio (version 2021.2)
[✓] IntelliJ IDEA Ultimate Edition (version 2022.1.3)
[✓] IntelliJ IDEA Community Edition (version 2022.1.3)
[✓] VS Code (version 1.73.1)
[✓] Connected device (3 available)
[✓] HTTP Host Availability

• No issues found!

Tested on: Android 7.0, Huawei NOVA, EMUI 5.0.3

pubspec.yaml:
audioplayers: ^1.1.1
flutter_cache_manager: ^3.3.0

Gustl22 commented 1 year ago

I can reproduce it with your code sample, but unfortunately there's too much boilerplate code to determine the source of the issue. Maybe you can remove all the code that is unnecessary. E.g. I tried without the cache (UrlSource only) and could also reproduce it; something along the lines of the sketch below should already be enough.
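
For illustration, a stripped-down version of the reported sequence could look roughly like this (two bare players, UrlSource only; the URLs are just placeholders):

```dart
import 'package:audioplayers/audioplayers.dart';
import 'package:flutter/material.dart';

// Minimal sketch of the reported sequence: start A, pause A, play B to the
// end, then check whether A resumes on its own. The URLs are placeholders.
final playerA = AudioPlayer(playerId: 'playerA');
final playerB = AudioPlayer(playerId: 'playerB');

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  runApp(MaterialApp(home: Scaffold(body: Center(child: ReproButtons()))));
}

class ReproButtons extends StatelessWidget {
  const ReproButtons({super.key});

  @override
  Widget build(BuildContext context) {
    return Column(
      mainAxisAlignment: MainAxisAlignment.center,
      children: [
        ElevatedButton(
          onPressed: () => playerA.play(UrlSource('https://example.com/a.mp3')),
          child: const Text('Play A'),
        ),
        ElevatedButton(
          onPressed: () => playerA.pause(),
          child: const Text('Pause A'),
        ),
        ElevatedButton(
          onPressed: () => playerB.play(UrlSource('https://example.com/b.mp3')),
          child: const Text('Play B (let it complete)'),
        ),
      ],
    );
  }
}
```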

Then I created an extra test for it: https://github.com/Gustl22/audioplayers/blob/feat/test-alternating-sources/packages/audioplayers/example/integration_test/lib_test.dart#L103

Maybe you can adapt it so that it reproduces your issue. Sorry, I don't have that much time to debug the whole code and find the cause.

JohnnyRainbow81 commented 1 year ago

@Gustl22 Thank you for your investigation! I will have a look at your code and give it a try...

JohnnyRainbow81 commented 1 year ago

So far I can only say that the bug is Android-exclusive! Besides my old Android 7 device, I just tested the code above on a newer Android 10 phone and the bug still exists.

On my iPhone 8 with iOS 16 I get the expected behavior; everything works fine there.

This might be a native issue, or an issue with how the streams are handled on the Dart side differently for iOS and Android. I don't think I'll be able to fix that myself.

@Gustl22 You seem to be pretty proficient with the library. I don't expect you or somebody else to fix this bug for me, but maybe you can point me in a direction where I can start debugging on the native side.

maxstubbersfield commented 1 year ago

Hi there, I have the same issue (or at least a very similar one). I may have found the cause by looking at the Kotlin code, although my knowledge of Kotlin is very basic, so I could definitely be wrong!

I think the issue might be that FocusManager listens for changes in the audio focus, and when a player regains focus it automatically starts playing again (provided it is not already playing and not released).

This would look like the following:

  1. playerA starts playing (has focus)
  2. playerA is paused
  3. playerB starts playing (gains focus)
  4. playerB completes (loses focus)
  5. playerA resumes playing as it regains the focus
    • handleFocusResult would be called by the OnAudioFocusChangeListener in FocusManager (for playerA)
    • actuallyPlay in WrappedPlayer would then be called

Maybe someone familiar with the code can confirm if this is the case.
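
If that is indeed what happens, one thing that might be worth trying as a Dart-side workaround (untested, and it means other apps are no longer ducked or paused) is to opt out of Android audio focus entirely, so the plugin never receives a "focus regained" callback that could auto-resume the paused player. A rough sketch, adapted from the audio context in the snippet above and assuming `AndroidAudioFocus.none` is available:

```dart
import 'package:audioplayers/audioplayers.dart';

// Possible workaround sketch (untested): request no Android audio focus at
// all, so the focus listener never fires and cannot auto-resume playerA.
// Trade-off: other apps' audio is no longer ducked or paused while playing.
void initializeAudioWithoutAndroidFocus() {
  final AudioContext audioContext = AudioContext(
    iOS: AudioContextIOS(
      defaultToSpeaker: true,
      category: AVAudioSessionCategory.ambient,
      options: [AVAudioSessionOptions.defaultToSpeaker, AVAudioSessionOptions.duckOthers],
    ),
    android: AudioContextAndroid(
      isSpeakerphoneOn: true,
      stayAwake: true,
      contentType: AndroidContentType.speech,
      usageType: AndroidUsageType.media,
      // Instead of gainTransientExclusive: do not take part in focus handling.
      audioFocus: AndroidAudioFocus.none,
    ),
  );

  AudioPlayer.global.setGlobalAudioContext(audioContext);
}
```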

Gustl22 commented 1 year ago

@JohnnyRainbow81 @maxstubbersfield Thank you for your input! I think the best approach would be to adapt the test so that it reliably reproduces the error described by @maxstubbersfield. Currently I have a lot of work to do and cannot promise any fix in the near future, so if you manage to fix the issue and validate it with a test, that would be a great relief. A rough idea of what such a test could look like is sketched below.
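
A sketch of how the linked integration test might be extended to cover this report (asset names and durations are placeholders, and it assumes the usual integration_test setup of the example app):

```dart
import 'package:audioplayers/audioplayers.dart';
import 'package:flutter_test/flutter_test.dart';
import 'package:integration_test/integration_test.dart';

void main() {
  IntegrationTestWidgetsFlutterBinding.ensureInitialized();

  test('paused player stays paused when another player completes', () async {
    final playerA = AudioPlayer(playerId: 'playerA');
    final playerB = AudioPlayer(playerId: 'playerB');

    PlayerState? lastStateA;
    playerA.onPlayerStateChanged.listen((state) => lastStateA = state);

    // Start A, let it run briefly, then pause it mid-playback.
    await playerA.play(AssetSource('audio/long-sample.mp3'));
    await Future<void>.delayed(const Duration(seconds: 1));
    await playerA.pause();

    // Play B through to the end and give focus handling a moment to settle.
    await playerB.play(AssetSource('audio/short-sample.mp3'));
    await playerB.onPlayerComplete.first;
    await Future<void>.delayed(const Duration(seconds: 1));

    // On the affected Android devices this expectation should currently fail,
    // because A gets auto-resumed (PlayerState.playing) instead of staying paused.
    expect(lastStateA, PlayerState.paused);

    await playerA.release();
    await playerB.release();
  });
}
```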