ryanheise / audio_service

Flutter plugin to play audio in the background while the screen is off.

MediaItem to accept additional property for albumArt as Byte array #603

Closed. talamaska closed this issue 3 years ago.

talamaska commented 3 years ago

Is your feature request related to a problem? Please describe. I'm working on a music/podcast player. I hooked everything up, and everything works as expected. There are a couple of things I would like to see implemented some day. I need MediaItem to accept album art as a byte array, because there is no way of getting an album art URI in Android 10+.

Describe the solution you'd like I would like to be able to set a new property, albumArtBytes, as a Uint8List (byte array).

Describe alternatives you've considered I tried different URLs in an attempt to pass something meaningful to artUri, and looked at what FlutterAudioQuery provides (albumUri). Stopping and restarting the service for each song is a no-go, as there is significant latency when starting the audio service. On Android 10 there is no way to get an album art URI; all we have is MediaMetadataRetriever's getEmbeddedPicture and ContentResolver's loadThumbnail method. So in practice, if I want to load local album art, the only way is to get a byte array.

Additional context

Current: [image]

Wanted: [image]

ryanheise commented 3 years ago

Android's media style notification already sets the notification colour for you based on the current artwork, and this is demonstrated by the example. As you can see in the example, the notification colour ends up being red which is automatically determined because the artwork contains red. If you set a new media item (without restarting the service) that happens to have a different dominant colour, it will automatically update the notification's background colour accordingly.

Is this sufficient, or are you saying you prefer not to let it happen automatically so that you can choose your own notification colour dynamically? I'm not sure if that is possible; I think the notification colour is only used when there is no artwork. If you have an artUri, the artwork will show and it will override any configured notification colour with a colour automatically picked from the artwork.

talamaska commented 3 years ago

Honestly, I haven't run the example, and that's my bad, but its data is all nice and hardcoded, something very far from what I have. The problem is that when using FlutterAudioQuery the albumArt ends up null, so I have to make a separate call to get the artwork, which is returned as a Uint8List (byte array). I'm not sure how feasible it will be if I have to request all the images upfront before sending the playlist to the audio service; in addition, artUri cannot accept in-memory images such as the byte arrays I get. I tried another possible URL, content://media/external/audio/albumart/ + albumId. It didn't change the notification and it didn't show the artwork. So I'm kinda stuck. If you think I'm doing something wrong and this is already implemented as you say, feel free to close the issue.

talamaska commented 3 years ago

Android 10 is making life harder. It turns out there is no way to get the art URI, just bytes. MediaMetadataRetriever has getEmbeddedPicture, ContentResolver has the loadThumbnail method, and that's it. If there is a way for MediaItem to accept a Uint8List, I'll be happy.

ryanheise commented 3 years ago

You can read the bytes and write them to a file, then pass its file path as the artUri. You can also progressively update media items with new metadata whenever you want; an example of this (updating the duration) is given in the FAQ, and you can find similar examples in past issues.
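That workaround can be sketched in Dart as follows. This is a minimal sketch, not audio_service's own code: it assumes the path_provider package for the cache directory, and cacheArtBytes, the thumbs/ subfolder and the songId-based file name are illustrative choices. The bytes themselves would come from whatever native call extracts the embedded picture.

```dart
import 'dart:io';
import 'dart:typed_data';

import 'package:path_provider/path_provider.dart';

/// Writes embedded-artwork bytes to a file in the app's cache directory and
/// returns a URI suitable for use as a MediaItem's artUri.
Future<Uri> cacheArtBytes(String songId, Uint8List bytes) async {
  final Directory dir = await getTemporaryDirectory();
  final File file = File('${dir.path}/thumbs/$songId');
  await file.create(recursive: true);
  await file.writeAsBytes(bytes);
  // Use Uri.file (not Uri.parse) so the platform side receives a proper
  // file:// URI it can resolve.
  return Uri.file(file.path);
}
```

The returned URI can then be set on the media item, e.g. via MediaItem.copyWith, before broadcasting it with setMediaItem.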

talamaska commented 3 years ago

Writing thumbs to the phone is impractical. I'm not sure if you are joking about that suggestion or seriously proposing it as a solution. That way you'll basically create a second copy of the thumbs, since they are already on the phone in Music/.thumbs; depending on the number of songs, you can take an infinite amount of MBs from the user's phone with duplicated thumbs. I looked at your code: it makes a request to the art URI and turns the result into a Bitmap. So having a byte array for the album art and turning it into a bitmap would be easier and faster. It would even speed up the whole process for local files.

ryanheise commented 3 years ago

Writing thumbs to the phone is impractical. I'm not sure if you are joking about that suggestion or seriously proposing it as a solution.

I'm sorry if you felt I was giving you a joke answer, but I was honestly trying to provide you with a workaround. This feature has been requested before, along with the ability to load artwork from an asset (i.e. artBytes and artAsset), and it is appealing, but the implementation is more complicated than you might think and there are efficiency issues that you may not have considered. So when this feature is implemented, the first implementation will most likely use files as an intermediate step.

Although even the initial implementation will have to wait until the dust settles from the next two releases.

That way you'll basically create a second copy of the thumbs, since they are already on the phone in Music/.thumbs; depending on the number of songs, you can take an infinite amount of MBs from the user's phone with duplicated thumbs.

You might be exaggerating a bit here. You would not need an infinite amount of MBs if you implement a limited-size FIFO cache: you strictly never need to hold more items in the cache than are currently in use. And I would expect it to be performant enough to first copy the bytes to a file, considering the data is being shuffled around locally on the same device.
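A limited-size FIFO cache of this kind might look like the following sketch (pure Dart; ArtFileCache and maxEntries are illustrative names, not part of audio_service):

```dart
import 'dart:collection';
import 'dart:io';
import 'dart:typed_data';

/// A minimal fixed-size FIFO cache of artwork files. When adding an entry
/// would exceed [maxEntries], the oldest cached file is deleted first, so
/// disk usage stays bounded regardless of playlist length.
class ArtFileCache {
  ArtFileCache(this.dir, {this.maxEntries = 20});

  final Directory dir;
  final int maxEntries;
  final Queue<String> _order = Queue<String>(); // keys in insertion order
  final Map<String, File> _files = <String, File>{};

  Future<File> put(String key, Uint8List bytes) async {
    final File? existing = _files[key];
    if (existing != null) return existing; // already cached
    // Evict the oldest entries until there is room for the new one.
    while (_order.length >= maxEntries) {
      final String oldest = _order.removeFirst();
      await _files.remove(oldest)?.delete();
    }
    final File file = File('${dir.path}/$key');
    await file.create(recursive: true);
    await file.writeAsBytes(bytes);
    _order.addLast(key);
    _files[key] = file;
    return file;
  }

  File? operator [](String key) => _files[key];
}
```

With a cap of, say, 20 entries, a 200-song playlist never holds more than 20 duplicated thumbnails on disk at once.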

I looked at your code: it makes a request to the art URI and turns the result into a Bitmap. So having a byte array for the album art and turning it into a bitmap would be easier and faster. It would even speed up the whole process for local files.

What happens is that the art URI is downloaded into a FIFO cache, and then the file is passed on to the platform side.

It is worth considering that the Android system represents artwork in the notification as a raw (uncompressed) bitmap. You do not want to hold too many of these in memory at once, so you do need some sort of file-based cache to get optimal performance on Android; Android's own guidelines recommend a file-based cache when preparing artwork for the notification and lock screen. One more consideration is that these MediaItem objects are transmitted very frequently over method channels and need to be lean, otherwise constantly bussing these bytes over the platform channel would become a serious performance bottleneck. Caching is essential to the solution.

talamaska commented 3 years ago

I'll consider that FIFO cache approach, thanks for explaining. Maybe I was overexaggerating a bit, but imagine a playlist with 200 songs: Android creates a thumb for every one of them and saves each with the id of the song coming from the ContentProvider. It doesn't compare the images, so you end up with copies of the same image under different names, 200 images held by the app. On top of that, I'll have to take care of removing the images that are no longer used. Maybe I'm not understanding the approach well. I will look at the flutter_cache_manager that you are leveraging.

Thanks for spending time on this.

talamaska commented 3 years ago

OK, I did some logic where I cache the embedded artwork in a file stored in the app's temporary folder, e.g. /data/user/0/full.android.app.name/cache/thumbs/26. I pass that path to the AudioMetadata artwork and to AudioService.setQueue with MediaItems having that path set.

@override
  Future<void> onUpdateQueue(List<MediaItem> queue) async {
    print('onUpdate queue ${queue.length}');
    queueList = queue;
    await AudioServiceBackground.setQueue(queue);
    await _player.setAudioSource(ConcatenatingAudioSource(
      children: queue
          .map(
            (MediaItem item) => AudioSource.uri(
              Uri.parse(item.id),
              tag: AudioMetadata(
                album: item.album,
                title: item.title,
                artist: item.artist,
                artwork: item.artUri,
              ),
            ),
          )
          .toList(),
    ));
  }

And the notification still doesn't show anything. What am I doing wrong?

ryanheise commented 3 years ago

The notification is set via the current media item rather than the queue, so depending on which version of audio_service you're using, that's AudioServiceBackground.setMediaItem or AudioHandler.mediaItem.add.

talamaska commented 3 years ago

I'm on audio_service: ^0.16.2+1, and I do call setMediaItem in my background task:

import 'dart:async';

import 'package:audio_service/audio_service.dart';
import 'package:audio_session/audio_session.dart';
import 'package:just_audio/just_audio.dart';
import 'package:podcasts_classic/models/audio_states.dart';

/// This task defines logic for playing a list of podcast episodes.
class AudioPlayerTask extends BackgroundAudioTask {
  final AudioPlayer _player = AudioPlayer();
  AudioProcessingState _skipState;
  Seeker _seeker;
  StreamSubscription<PlaybackEvent> _eventSubscription;

  List<MediaItem> queueList;
  int get index => _player.currentIndex ?? 0;
  MediaItem get mediaItem =>
      (queueList == null || queueList.isEmpty) ? null : queueList[index];

  @override
  Future<void> onCustomAction(String name, dynamic arguments) async {
    if (name == 'setVolume') {
      _player.setVolume(arguments as double);
    }
  }

  @override
  Future<void> onStart(Map<String, dynamic> params) async {
    // We configure the audio session for speech since we're playing a podcast.
    // You can also put this in your app's initialisation if your app doesn't
    // switch between two types of audio as this example does.
    final AudioSession session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration.music());
    // Broadcast media item changes.
    _player.currentIndexStream.listen((int index) {
      if (index != null && queueList != null && queueList.isNotEmpty) {
        print('onstart set media item');
        AudioServiceBackground.setMediaItem(queueList[index]); // <--- here I can clearly see this is called with proper data
      }
    });
    // Propagate all events from the audio player to AudioService clients.
    _eventSubscription =
        _player.playbackEventStream.listen((PlaybackEvent event) {
      print('playbackEventStream $event');
      _broadcastState();
    });
    // Special processing for state transitions.
    _player.processingStateStream.listen((ProcessingState state) {
      switch (state) {
        // case ProcessingState.completed:
        //   break;
        case ProcessingState.ready:
          // If we just came from skipping between tracks, clear the skip
          // state now that we're ready to play.
          _skipState = null;
          break;
        default:
          break;
      }
    });

    _player.playerStateStream.listen((PlayerState playerState) {
      print('playerStateStream $playerState');
      // ... and forward them to all audio_service clients.
      AudioServiceBackground.setState(
        playing: playerState.playing,
        // Every state from the audio player gets mapped onto an audio_service state.
        processingState: {
          ProcessingState.idle: AudioProcessingState.none,
          ProcessingState.loading: AudioProcessingState.connecting,
          ProcessingState.buffering: AudioProcessingState.buffering,
          ProcessingState.ready: AudioProcessingState.ready,
          ProcessingState.completed: AudioProcessingState.completed,
        }[playerState.processingState],
        // Tell clients what buttons/controls should be enabled in the
        // current state.
        controls: [
          MediaControl.skipToPrevious,
          if (playerState.playing) MediaControl.pause else MediaControl.play,
          MediaControl.stop,
          MediaControl.skipToNext,
        ],
        // playing: playerState.playing,
        // position: _player.position,
        // bufferedPosition: _player.bufferedPosition,
        // speed: _player.speed,
        androidCompactActions: [0, 1, 3],
      );
    });
  }

  @override
  Future<void> onSkipToQueueItem(String mediaId) async {
    print('onSkipToQueueItem');
    // Then default implementations of onSkipToNext and onSkipToPrevious will
    // delegate to this method.
    final int newIndex =
        queueList.indexWhere((MediaItem item) => item.id == mediaId);
    if (newIndex == -1) return;
    // During a skip, the player may enter the buffering state. We could just
    // propagate that state directly to AudioService clients but AudioService
    // has some more specific states we could use for skipping to next and
    // previous. This variable holds the preferred state to send instead of
    // buffering during a skip, and it is cleared as soon as the player exits
    // buffering (see the listener in onStart).
    _skipState = newIndex > index
        ? AudioProcessingState.skippingToNext
        : AudioProcessingState.skippingToPrevious;
    // This jumps to the beginning of the queue item at newIndex.
    _player.seek(Duration.zero, index: newIndex);
  }

  @override
  Future<void> onUpdateQueue(List<MediaItem> queue) async {
    print('onUpdate queue ${queue.length}');
    queueList = queue;
    await AudioServiceBackground.setQueue(queue);
    await _player.setAudioSource(ConcatenatingAudioSource(
      children: queue
          .map(
            (MediaItem item) => AudioSource.uri(
              Uri.parse(item.id),
              tag: AudioMetadata(
                  album: item.album,
                  title: item.title,
                  artist: item.artist,
                  artwork: item.artUri),
            ),
          )
          .toList(),
    ));
  }

  @override
  Future<void> onPlay() => _player.play();

  @override
  Future<void> onPause() => _player.pause();

  @override
  Future<void> onSeekTo(Duration position) => _player.seek(position);

  @override
  Future<void> onFastForward() => _seekRelative(fastForwardInterval);

  @override
  Future<void> onRewind() => _seekRelative(-rewindInterval);

  @override
  Future<void> onSeekForward(bool begin) async => _seekContinuously(begin, 1);

  @override
  Future<void> onSeekBackward(bool begin) async => _seekContinuously(begin, -1);

  @override
  Future<void> onStop() async {
    await _player.dispose();
    _eventSubscription.cancel();
    // It is important to wait for this state to be broadcast before we shut
    // down the task. If we don't, the background task will be destroyed before
    // the message gets sent to the UI.
    await _broadcastState();
    // Shut down this task
    await super.onStop();
  }

  @override
  Future<void> onSkipToNext() {
    return _skip(1);
  }

  @override
  Future<void> onSkipToPrevious() {
    return _skip(-1);
  }

  Future<void> _skip(int offset) async {
    final MediaItem mediaItem = AudioServiceBackground.mediaItem;
    if (mediaItem == null) return;
    final List<MediaItem> queue = AudioServiceBackground.queue ?? <MediaItem>[];
    final int i = queue.indexOf(mediaItem);
    if (i == -1) return;
    int newIndex = i + offset;
    if (newIndex == queue.length) {
      newIndex = 0;
    }
    if (newIndex < 0) {
      newIndex = queue.length - 1;
    }
    if (newIndex >= 0 && newIndex < queue.length)
      await onSkipToQueueItem(queue[newIndex]?.id);
  }

  /// Jumps away from the current position by [offset].
  Future<void> _seekRelative(Duration offset) async {
    Duration newPosition = _player.position + offset;
    // Make sure we don't jump out of bounds.
    if (newPosition < Duration.zero) newPosition = Duration.zero;
    if (newPosition > mediaItem.duration) newPosition = mediaItem.duration;
    // Perform the jump via a seek.
    await _player.seek(newPosition);
  }

  /// Begins or stops a continuous seek in [direction]. After it begins it will
  /// continue seeking forward or backward by 10 seconds within the audio, at
  /// intervals of 1 second in app time.
  void _seekContinuously(bool begin, int direction) {
    _seeker?.stop();
    if (begin) {
      _seeker = Seeker(_player, Duration(seconds: 10 * direction),
          const Duration(seconds: 1), mediaItem)
        ..start();
    }
  }

  /// Broadcasts the current state to all clients.
  Future<void> _broadcastState() async {
    await AudioServiceBackground.setState(
      controls: [
        MediaControl.skipToPrevious,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.stop,
        MediaControl.skipToNext,
      ],
      systemActions: [
        MediaAction.seekTo,
        MediaAction.seekForward,
        MediaAction.seekBackward,
      ],
      androidCompactActions: [0, 1, 3],
      processingState: _getProcessingState(),
      playing: _player.playing,
      position: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
    );
  }

  /// Maps just_audio's processing state into audio_service's playing
  /// state. If we are in the middle of a skip, we use [_skipState] instead.
  AudioProcessingState _getProcessingState() {
    if (_skipState != null) return _skipState;
    switch (_player.processingState) {
      case ProcessingState.idle:
        return AudioProcessingState.stopped;
      case ProcessingState.loading:
        return AudioProcessingState.connecting;
      case ProcessingState.buffering:
        return AudioProcessingState.buffering;
      case ProcessingState.ready:
        return AudioProcessingState.ready;
      case ProcessingState.completed:
        return AudioProcessingState.completed;
      default:
        throw Exception('Invalid state: ${_player.processingState}');
    }
  }
}

class Seeker {
  Seeker(
    this.player,
    this.positionInterval,
    this.stepInterval,
    this.mediaItem,
  );
  final AudioPlayer player;
  final Duration positionInterval;
  final Duration stepInterval;
  final MediaItem mediaItem;
  bool _running = false;

  Future<void> start() async {
    _running = true;
    while (_running) {
      Duration newPosition = player.position + positionInterval;
      if (newPosition < Duration.zero) newPosition = Duration.zero;
      if (newPosition > mediaItem.duration) newPosition = mediaItem.duration;
      player.seek(newPosition);
      await Future<dynamic>.delayed(stepInterval);
    }
  }

  void stop() {
    _running = false;
  }
}
The MediaItem being set looks like this:

album:"some album"
artUri:"/data/user/0/com.android.full.name/cache/thumbs/25"
artist:"some artist"
displayDescription:null
displaySubtitle:null
displayTitle:null
duration:Duration (0:07:36.385000)
extras:null
genre:null
id:"content://media/external/audio/media/26"
playable:true
rating:null
title:"some song title"
hashCode:430258969
runtimeType:Type (MediaItem)
ryanheise commented 3 years ago

I probably won't be able to reproduce the issue with the above code alone - it would be best if possible to share a link to a git repo containing everything required to reproduce the problem.

talamaska commented 3 years ago

I'm not sure I can make a demoable issue easily. This whole thing requires multiple steps: make an app with audio query and audio service, manually copy a song onto the phone, manually move the song into the Music folder via the Files app, start the app, read the audio files on the phone to get a list of songs, get the embedded art for at least one song, save it as a file in the temporary folder, get the path, and set a playlist with at least one song with that album URI. I can make my Bitbucket repo public with the whole code. In the end I didn't use flutter_audio_query, but my own package that does the things I need (get all songs, get metadata for a song, get art): git@bitbucket.org:talamaska/podcastsclassic.git, and https://github.com/talamaska/flutter_id3_reader is the unpublished package I use to get songs. All you have to do is clone the repo, add a song to the phone, move the song to the Music folder, and run the app; tap on a song on the first screen and the audio playing service will initiate if it hasn't already, and start playing. Please give me a sign when you have cloned the repo, as I don't want to keep it public.

ryanheise commented 3 years ago

I'm sorry, I generally do not look at people's whole projects as it is too time consuming and distracts me from working on the plugin. Though if you create a minimal reproduction project, I will look at it. I do not think that creating a minimal reproduction should be as complicated as you make it sound. For instance, there is no way that audio query is a necessary ingredient to trigger such a bug - bytes are bytes, so you ought to be able to short circuit this by including the problematic bytes as an asset in your example. In the process of creating a minimal reproduction project, you may find that it actually works. If so, great! It means you have discovered the problem. Otherwise if your minimal reproduction project still fails, also great! It means it is narrow enough that I can take the time to look at it.

talamaska commented 3 years ago

OK, so maybe I should just copy this library's example, copy an image from assets to temp, and launch the audio service with a single file, probably even a remote one, like in your examples?

ryanheise commented 3 years ago

Yes, typically the fastest way to create a minimal reproduction project is to fork this repo and make minimal changes to its example.

talamaska commented 3 years ago

Here it is. I couldn't figure out how to start with an empty playlist (the audio service kept shutting down), and I couldn't figure out how to not play automatically. But the code for passing the file stored on the phone is there: move the image first, then start the audio player, then update the playlist. You'll see the title change to "test ...something", but no image, and the background is the default one. https://github.com/talamaska/audio_service

talamaska commented 3 years ago

I see this change: "Change artUri type from String to Uri." Is it related to handling paths to local assets?

ryanheise commented 3 years ago

It was already possible to use a file prior to this change. What this change does is just make the type more specialised.

talamaska commented 3 years ago

I've found what was wrong. Since the latest updates artUri is no longer a String, and Uri.parse was not the correct way of passing the path to a local file. I used Uri.file('/data/user/0/com.ryanheise.audioserviceexample/cache/thumbs/mychachedimage.jpg') and now I see the art image in the notification, and on top of that the notification changes the colours of the background, the buttons and the text automagically. This is great.
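The difference between the two constructors can be seen in plain Dart (the path below is an example, not the real one from this thread):

```dart
void main() {
  const String path = '/data/user/0/com.example.app/cache/thumbs/26';

  // Uri.parse treats the string as an already-formed URI. A bare path has
  // no scheme, so the platform side cannot tell it is a local file.
  final Uri parsed = Uri.parse(path);
  print(parsed.scheme); // '' (empty - no scheme)

  // Uri.file builds a proper file:// URI from a filesystem path.
  final Uri fileUri = Uri.file(path);
  print(fileUri.scheme); // 'file'
  print(fileUri); // file:///data/user/0/com.example.app/cache/thumbs/26
}
```

This is why the artwork only appeared once Uri.file was used: only then does the URI carry the file scheme the native side resolves.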

ryanheise commented 3 years ago

My apologies, I have been less speedy/productive over the past months due to health issues, but I am glad you managed to track down the issue. I had hoped one of the benefits of narrowing the type to Uri would be that it would ensure only valid URIs are passed in, rather than arbitrary strings. I guess it is still possible to use Uri.parse(...) in a way that isn't right yet doesn't raise any exception, so perhaps I should improve this with documentation.

github-actions[bot] commented 2 years ago

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs, or use StackOverflow if you need help with audio_service.