ryanheise / just_audio

Audio Player

Gapless playlists, composable looping, clipping and shuffling (TESTERS NEEDED!) #131

Closed ryanheise closed 3 years ago

ryanheise commented 4 years ago

Hey everyone. This set of features has involved a huge amount of work. Actually, it would have been a day's work for Android, a week for Web, and 3 weeks for iOS (due to its weird APIs, and still it is not 100% perfect).

You can try it on the media-source branch (edit: this has now been merged to master).

I have modeled this set of features after ExoPlayer's composable media source API, and emulated this API for web and iOS:

player.load(audioSource)     // edit - I originally described the wrong method name

There are the following subclasses of AudioSource:

The original API still exists as convenience methods.

I expect there will need to be a long period of testing before I can release this, to iron out the major bugs, at least to the point where the previous feature set works properly, as I wouldn't want to break anything. So I would really appreciate having some people test the new features and report any bugs.

P.S. I really appreciate the support of my GitHub sponsors @yringler , @snaeji , @nicoeg , @getmmg , @duncan-iaria , and @moritz-weber as I probably would not have put this amount of time into it had it not been for their support. As I've been stuck on this one thing for a month, I hope I'll soon be able to start making progress in other areas, including audio_service.

fuzing commented 4 years ago

Ryan -

Kudos - and thank you so much for this!

Just so I have the API correct - is this the general idea?:

final AudioPlayer audioPlayer = AudioPlayer();
final uri = Uri.parse('https://somesite.com/media.mp4');
final audioSource = LoopingAudioSource(audioSource: AudioSource.uri(uri));
await audioPlayer.setAudioSource(audioSource);
audioPlayer.play();

The only reason I ask is that I'm not seeing the .setAudioSource() method on the player, although the other steps appear to work (using the media-source branch).

Also, does looping extend to the old .setAsset(String uri) and .setFilePath(String uri) setters in some way, or is this the new way? For example:

final audioSource = LoopingAudioSource(audioSource: AudioSource.uri(Uri.parse('file://some-path')));

and

final audioSource = LoopingAudioSource(audioSource: AudioSource.uri(Uri.parse('asset://some-path')));

Cheers, and again, thank you for all of your efforts!

yringler commented 4 years ago

Looks like it's called load (see here). Also, make sure to load from git. Specify a ref for the branch (media-source) or the commit hash. Personally I prefer using the commit hash because it makes my code more deterministic; whenever there's an update to the branch you have to update the hash, but I think that beats builds suddenly failing. See https://github.com/hivedb/hive/issues/337#issuecomment-641687964 for an example which uses something else, dependency overrides.
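
For reference, a git dependency in pubspec.yaml might look roughly like this (a sketch; pin ref to the branch or to a specific commit hash, and a path entry may also be needed if the package lives in a subdirectory of the repo):

dependencies:
  just_audio:
    git:
      url: https://github.com/ryanheise/just_audio.git
      ref: media-source  # or a specific commit hash for reproducible builds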

fuzing commented 4 years ago

ahhhh - thank you - I'll try load().

I appreciate your quick feedback.

fuzing commented 4 years ago

I can confirm that I have looping running on android using code similar to the following:

final AudioPlayer audioPlayer = AudioPlayer();
final uri = Uri.parse('https://somesite.com/media.mp4');
final audioSource = LoopingAudioSource(audioSource: AudioSource.uri(uri), count: 100);
await audioPlayer.load(audioSource);
audioPlayer.play();

My audio fails to play if count is omitted. Perhaps this is the intended behavior (i.e. not passing count defaults to zero plays?). I'd recommend defaulting to 1 play, or requiring a count for looping. I'd also suggest that a count of -1, or an omitted/null count, mean infinite looping until stop/pause/dispose is called.

I'll continue to test on android and ios. Well done!

ryanheise commented 4 years ago

@fuzing Thanks for pointing that out. Originally I had intended to make it infinite by default, but only later did I realise this would be difficult to implement on other platforms. Perhaps until then it would be better for me to make it required so that I always have the option to add an infinite default in the future.

Instead, if you wish to loop an infinite number of times, you should use:

player.setLoopMode(...);

There are 3 modes:

  • off: Play everything just once
  • one: Loops the current item indefinitely (until you set loop mode to "off")
  • all: Loop the sequence of all items indefinitely (until you set loop mode to "off")
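
For example, to keep a single short clip looping indefinitely (a minimal sketch using the modes above):

await player.setLoopMode(LoopMode.one); // keeps looping the current item until set back to LoopMode.off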

Also note that the way I have implemented LoopingAudioSource, it is not efficient beyond a small number of repetitions, since at least on iOS it will duplicate entries in the queue. In your case, that's 100 duplicates. So it's better to use setLoopMode, and if you really need to stop after 100, you could do your own counting.

I'm not sure if you actually wanted to loop 100 times, or whether you just picked some really large number to emulate infinite looping. If the former, I don't think I actually have an event which you could listen to for that. The completed state is only reached when playback stops. I guess you could observe the position, and whenever it is updated to zero that could let you know to increment your counter, but be aware that seeking often isn't precise, so sometimes when it loops back to zero internally, the position may actually show up as 7 milliseconds instead of 0. Another thing you could do is observe the updated position and trigger a count whenever you observe it jumping from approximately the duration to approximately zero (within say 100ms), or you could just have a Dart timer that waits for (item duration)*99.5 and then sets the loop mode to off. Obviously I need to have a think about the API a bit more for this use case, although setLoopMode was really intended for music players where the user presses a loop button to turn it off themselves.

fuzing commented 4 years ago

Ryan - thanks - all good info.

My use-case simply requires me to loop a short clip (white-noise, a waterfall... whatever) indefinitely until the user wants it to stop, so I'll utilize .setLoopMode() as suggested. I also greatly appreciate your insights regarding events/timing/etc. (stuff that I'm sure would bite me in the absence of such knowledge). I'll also be doing some iOS testing next week, so I'll incorporate the new just_audio stuff into that, and try to provide feedback. Be well and have a great day.

ryanheise commented 4 years ago

I've just updated the documentation, and made a small API change for composable audio sources to make them more closely resemble Flutter's composable widget API. Now, the children of a ConcatenatingAudioSource are called children rather than audioSources, and the child of a ClippingAudioSource is called child rather than audioSource.
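
For illustration, the composition now reads like this (a sketch; the start/end parameter names on ClippingAudioSource are my assumption):

final playlist = ConcatenatingAudioSource(
  children: [
    AudioSource.uri(Uri.parse("https://example.com/track1.mp3")),
    AudioSource.uri(Uri.parse("https://example.com/track2.mp3")),
  ],
);
final clip = ClippingAudioSource(
  child: AudioSource.uri(Uri.parse("https://example.com/track1.mp3")),
  start: Duration(seconds: 10), // assumed parameter name
  end: Duration(seconds: 20),   // assumed parameter name
);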

Now would also be a good time to make API naming decisions and take any suggestions people have.

E.g. should I come up with shorter names than the ExoPlayer-inspired ConcatenatingAudioSource, LoopingAudioSource and ClippingAudioSource?

I would prefer single-word names but at the same time that risks leading to clashes with future official Flutter class names. But for example, here are some ideas:

There is a standard class in Android called Looper but there is not (yet) any class by that name in Flutter. In Flutter's naming for its own widgets, subclasses of Widget tend not to have any "Widget" suffix in their names which helps with brevity when composing widget trees.

An alternative is to just add convenience factory constructors (or static methods) on AudioSource:

But there would still need to be actual classes behind each of these.
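
Purely for illustration, such factory constructors might look something like this (the names loop and concat are hypothetical; only AudioSource.uri exists today, and AudioSource.loop is mentioned as an idea in the next comment):

// Hypothetical convenience constructors, not part of the current API.
final source = AudioSource.loop(
  AudioSource.concat([
    AudioSource.uri(Uri.parse("https://example.com/a.mp3")),
    AudioSource.uri(Uri.parse("https://example.com/b.mp3")),
  ]),
  count: 2,
);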

fuzing commented 4 years ago

In terms of naming I think your static factory methods (AudioSource.loop(), etc.) combined with the more verbose "android style" class-names (i.e. prefer HlsAudioSource vs. simply Hls) is the way to go - the abridged names say nothing about context (and I'm old, so I tend to forget what goes where!!). A caveat here - I'm pretty new to dart/flutter, so would defer to whatever the "norms" are.

rohansohonee commented 4 years ago

Hi @ryanheise,

Hope you are doing well.

Congratulations! 🎉 Your efforts towards audio plugins for Flutter are amazing. They inspired me to develop an equalizer plugin.

My question is whether I should integrate my equalizer plugin into just_audio. I have only contributed the Android side of the code so far: https://pub.dev/packages/equalizer

I also posted an audio sample app based on your plugins: https://github.com/rohansohonee/ufmp

It's mostly the example code from the audio_service plugin, but adds a few more minor things. You can have a look and also feature it in the audio_service & just_audio READMEs.

P.S. are you a speedcuber? I visited your website.

fuzing commented 4 years ago

@ryanheise - I just managed to test a little on iOS, and while looping is working using both methods:

player.setLoopMode(); and by instantiating via the LoopingAudioSource() method

I'm not getting gapless playback (single file looping, no playlist). There is an approximately 100ms stutter/silence when transitioning from the end back to the start. If this is a known issue being worked on then I'll stand by. Otherwise I'm happy to probe/dig a little more.

If it helps, I'm testing using iOS 12.4.8.

ryanheise commented 4 years ago

@fuzing yes unfortunately that is correct for now. The "gapless" support on iOS runs out at the loop point due to the way I have implemented it. Currently I have this comment in the code:

                // TODO: Currently there will be a gap at the loop point.
                // Maybe we can do something clever by temporarily adding the
                // first playlist item at the end of the queue, although this
                // will affect any code that assumes the queue always
                // corresponds to a contiguous region of the indexed audio
                // sources.
                // For now we just do a seek back to the start.

That comment was in the "loop all" case, but in the "loop one" case I could do something similar. It's just that implementing this trick needs to be done carefully so as not to break all the other code that makes assumptions about the queue. So it will be the last thing I tackle after everything else is stable.

In the meantime, what you could do is use a LoopingAudioSource to loop 10 times, and then also set loop mode to all so that you hear the gap only once every 10 times, without using up too many resources. Perhaps you could monitor memory usage to see how far above 10 you could increase it. If you're lucky, iOS disposes of buffers that are unused rather than keeping them all in memory.

ryanheise commented 4 years ago

@rohansohonee nice work! In terms of merging the two plugins, my main criterion is that I am a fan of single-responsibility plugins. Ideally, I had envisaged a Flutter audio plugin ecosystem where plugins could be mixed and matched together in the same app. When I created just_audio, unfortunately the state of affairs was (and still is) such that audio plugins had started trying to take on multiple responsibilities, which would result in them either stepping on each other's toes by overwriting each other's audio session settings, or having unresolvable conflicts due to overlapping responsibilities. So in my mind, I thought it would be nice if an audio player plugin "just" played audio, an audio recorder plugin "just" recorded audio, etc. i.e. It would be nice if users could pick their favourite:

... and be able to import the ones they need for their app, and then they would all work happily together. This sort of plugin cooperation I think would be much more feasible with single responsibility audio plugins. At the same time, I wouldn't suggest that we go too finely-grained and create a separate plugin for every platform class! So related APIs should be provided under one plugin.

So in terms of the equalizer feature, I wonder whether it would be best to merge it or keep it separate? I suppose that just_audio is intended to "just" play audio, but an equaliser is related "enough" to the production of sound that you can justify having it in the same plugin. I also don't see the presence of an equaliser in the plugin causing any conflict with other plugins.

At the same time, if we kept it separate, I would definitely link to your plugin from the README.

One last thought: it could be a good idea to create a family of plugins that are in agreement with the above philosophy / manifesto. i.e. Audio plugins that are not monolithic, but are designed to work together nicely with other audio plugins. I know we have a lot of competition with some monolithic plugins that offer a certain level of convenience, but @yringler has demonstrated that it is possible to create a higher level plugin that depends on these individual components to create a convenient combination of these lower level plugins. This type of approach would promote more cooperation between plugins, offer more choice between alternatives, and in some cases result in less duplication of effort (i.e. rather than every plugin trying to duplicate every other feature of every other plugin, they could just focus on one responsibility).

There's a lot of work yet to be done to achieve this goal, but hopefully there are enough people interested in this to help make it happen.

P.S. Yes I'm a cuber :-) You can probably find a bit about that side of me searching for Ryan Heise on YouTube.

rohansohonee commented 4 years ago

Merging just_audio with equalizer would help by providing the audioSessionId, which is required by the Android Equalizer to add effects to the media. Currently just_audio does not have an API to retrieve the current media's audioSessionId. I don't know how this would work going forward for iOS, Web or even Desktop (will they also have audioSessionIds or not?). I can only test and work on Android.

Thank you for providing your feedback. :-)

P.S. Amazing to find out you are also a cuber. Have you taken part in any speed-cubing competitions? Here is my WCA profile https://www.worldcubeassociation.org/persons/2011SOHO01 if you want to see. It's so random to meet a cuber on GitHub. What were the odds??

ryanheise commented 4 years ago

That's a good point, although I can conceive of (admittedly uncommon) scenarios where you might want to share the same session id between plugins. For example, if I ended up creating an audio synthesis plugin, and an app wanted both plugins to mix audio via the same session id. I'm fine to take either approach to start with, and we can always reconsider it in the future if another approach makes more sense.

Regarding iOS, it's unfortunately not as easy as it is on Android, but I found this link which might be useful: https://stackoverflow.com/questions/30218449/xcode-auipodeq-augraph

As for getting this to work with ExoPlayer, I found this open issue on their issues page: https://github.com/google/ExoPlayer/issues/3058 (maybe you're already familiar with it)

(On the cubing question, I'm a bit of a dinosaur. I was mainly active before the WCA was formed and so I never participated in the events. However, I started the original linear fewest moves competition back in the day, and coined the term "Linear FMC" which was later adopted by the WCA. I also created the first Rubik's Cube Simulator designed for speed cubing, although sadly it doesn't work anymore due to being a Java applet. These days I'm busy with life in general and so haven't bothered to update it.)

ryanheise commented 4 years ago

I have just completed the last planned change for the next release, which is the new state model based on ExoPlayer. It would probably be a good idea for me to explain the change somewhere to avoid any confusion.

The major change is that rather than having state + buffering, we now have state + playing. The playing boolean is switched via play() and pause() only, which means it stays true even after reaching the end of the track. So although the audio appears to stop when reaching the end of the track, the player is still in playing mode, and if you seek to earlier in the track, you will hear audio continue. Previously, when the player reached the end of the track, it would stop and rewind, which is not always what people want to happen; now, if you want that behaviour, you have the flexibility to rewind and pause after it reaches the end. The other change is that there is no "stopped" state anymore. The underlying platform APIs don't have anything like this anyway, so you basically just use pause instead, although I have added a stop method which is just a convenience method for pausing and seeking to zero.
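
A minimal sketch of the new semantics, using only the methods described above (the comments describe the intended behaviour rather than a tested trace):

player.play();              // playing becomes true
// ... end of track reached: audio goes silent, but playing stays true ...
player.seek(Duration.zero); // audio resumes from the start, still "playing"
// If you want the old stop-and-rewind behaviour at the end of a track:
player.pause();             // playing becomes false
player.seek(Duration.zero);
// or equivalently, the new convenience method:
player.stop();              // pause + seek to zero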

Another change in the latest commits is to implement bufferedPosition and the buffering state correctly on web, as well as to broadcast duration changes.

One more change is a new positionStream which dynamically adjusts its frequency depending on the duration of the track. On long tracks where the seek bar moves more slowly, it emits position updates more slowly. On short tracks where the seek bar moves more quickly, it emits position updates more quickly.
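
For example, a seek bar can simply listen to this stream (a sketch):

player.positionStream.listen((position) {
  // Update the seek bar; the emission rate adapts to the track's duration.
});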

I have also merged this with master since I would like to push out a release and it would be good to get more eyes on it.

jjb182 commented 4 years ago

Hi guys, so can you use the Looping and the PlayLists together?

I'm just sitting down to wrap my mind around this.

I'm having an issue where Android devices (likely iOS too) stop my app from running in the background, and that kills the audio too. I'm programmatically looping 3-second files for 4 minutes, and then changing the file and looping again. After about 20 minutes the app stops doing background work and so the audio stops. I thought that a playlist might help, but my playlist would have to loop each track about 80 times...

final audioSource1 = LoopingAudioSource(audioSource: AudioSource.uri(uri1), count: 80);
final audioSource2 = LoopingAudioSource(audioSource: AudioSource.uri(uri2), count: 80);
final audioSource3 = LoopingAudioSource(audioSource: AudioSource.uri(uri3), count: 80);
AudioSource.playlist.add(audioSource1);
AudioSource.playlist.add(audioSource2);
AudioSource.playlist.add(audioSource3);
await audioPlayer.load(AudioSource);
audioPlayer.play();

Clearly I have NOT played with the code yet :)

ryanheise commented 4 years ago

Hi @jjb182, LoopingAudioSource isn't intended for large counts (or infinite counts), so instead you should use audioPlayer.setLoopMode(LoopMode.all), which will loop all items in your playlist forever (or until you turn loop mode off again). I have added documentation on both APIs to that effect, although understandably, since I haven't published this release yet, the HTML documentation isn't generated yet. If you navigate to the method or class in the code, you can read the documentation.
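
Adapting the snippet above, that would look roughly like this (a sketch):

final audioPlayer = AudioPlayer();
await audioPlayer.load(ConcatenatingAudioSource(children: [
  AudioSource.uri(uri1),
  AudioSource.uri(uri2),
  AudioSource.uri(uri3),
]));
await audioPlayer.setLoopMode(LoopMode.all); // loop the whole playlist until turned off
audioPlayer.play();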

Regarding background audio, you need to read the second sentence of the README file.

mohammadne commented 4 years ago

Hi @ryanheise, I have a ConcatenatingAudioSource of List<AudioSource> in my playlist.

For now, it will automatically handle player completion, but I can't go to the next and previous items manually.

Thanks for adding this awesome feature.

ryanheise commented 4 years ago

@mohammadne you can find this in the README:

Gapless playlists:

await player.load(
  ConcatenatingAudioSource(
    children: [
      AudioSource.uri(Uri.parse("https://example.com/track1.mp3")),
      AudioSource.uri(Uri.parse("https://example.com/track2.mp3")),
      AudioSource.uri(Uri.parse("https://example.com/track3.mp3")),
    ],
  ),
);
// Jump to the beginning of track3.mp3.
player.seek(Duration(milliseconds: 0), index: 2);

You can also get the current index via player.currentIndex.
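
Skipping to the next or previous item can be done the same way, for example (a sketch; add your own bounds checking):

player.seek(Duration.zero, index: player.currentIndex + 1); // next
player.seek(Duration.zero, index: player.currentIndex - 1); // previous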

ryanheise commented 4 years ago

I should also mention that I haven't fully tested dynamically adding and removing items from ConcatenatingAudioSource after it has been loaded. The example above has been tested (where the entire playlist is known on creation).

ryanheise commented 4 years ago

Since I haven't heard of any serious issues so far, I am planning to release this within the next day.

fuzing commented 4 years ago

@ryanheise - I've run numerous tests - ditto - nothing glaring. I'd go for it! Thanks, and well done.

ryanheise commented 4 years ago

Thanks for testing! I did find a few issues which I'm in the process of fixing. One is that the player can lose track of the current index when an error occurs.

I will post here once I've fixed it.

ryanheise commented 4 years ago

There was also an issue on iOS where positionStream wouldn't update.

ryanheise commented 4 years ago

I've fixed those errors (or, I could have introduced new ones, but hopefully not).

I also added some new exception classes which can be thrown by load instead of generic ones.

This is a release candidate. I'll release it maybe in an hour or two unless anyone discovers further issues.

ryanheise commented 4 years ago

It's probably too late for me to consider this now that I'm on the verge of releasing, but I thought of another naming strategy for the composable audio elements:

Not too sure about the others, though:

Part of me thinks it doesn't matter having such long class names, but the other part of me likes single-word names. My only concern is that they may conflict with future class names in the standard Flutter/Dart libraries.

Either way, I will go ahead and release it as is, and I can change the names later if there's a convincing reason or urge to.

ryanheise commented 4 years ago

0.3.0 release! Thank you to everyone for your suggestions and testing.

Next, I plan to update the audio_service example to use these new features, and get out a new audio_service release.

nuc134r commented 4 years ago

That's very good news. Will get my hands on it as soon as possible.

Was dreaming of these features but couldn't implement them even in Java due to lack of time. It will be very handy to have them at the Dart level for sure.

Ryan, you rock. I have no idea how you have so much time and energy to maintain this set of Flutter background audio libraries in this wholesome manner. I guess you do something related full time. Do you?

As a matter of fact, Ryan is a hero we all need but do not deserve.

yringler commented 4 years ago

> Part of me thinks it doesn't matter having such long class names, but the other part of me likes single-word names. My only concern is that they may conflict with future class names in the standard Flutter/Dart libraries.

I like the longer class names. One word is too short IMO - Loop, Clip etc are very generic. In general, I recall reading that, when using inheritance, the child class name should add on to the parent class name. For example a Human class would have a BigHuman child class. I can't remember where I saw that, but that's what I've been doing for a while now, and I think it works well. This is a borderline totally out of left field example, but the wonderful vuejs recommends very long component names for tightly coupled UI components.

ryanheise commented 4 years ago

Thanks, @nuc134r :-)

Most of the plugin development gets done at night after I finish work (programming), as well as on weekends. Last night I published the new release around 4am (although that sounds more extreme than it is, since I tend to wake up around 11am most days).

ryanheise commented 4 years ago

> I like the longer class names. One word is too short IMO - Loop, Clip etc are very generic. In general, I recall reading that, when using inheritance, the child class name should add on to the parent class name. For example a Human class would have a BigHuman child class. I can't remember where I saw that, but that's what I've been doing for a while now, and I think it works well. This is a borderline totally out of left field example, but the wonderful vuejs recommends very long component names for tightly coupled UI components.

This is indeed the standard I have used in Java-style languages, although it's worth keeping in mind that these long names often are a consequence of poorly designed languages. For example, before object orientation came along, a function for moving a window might have been called MoveWindow. Ever since methods could be scoped within classes, this was shortened to move within the scope Window. Java could never really extend this benefit beyond classes to packages though since the import mechanism didn't allow you to import different packages under different namespaces. Dart has this feature, so you can have generic math constants like pi but you can import them via a namespace, and refer to them as math.pi if you want to avoid clashes. I think Java and C++-style languages have been dominant in OO for so long now that for many people these conventions can feel like universal rules, although if I'm thinking idealistically, I think we should hope to move in the direction of better languages rather than conventions to work around language limitations. I don't think Dart has it all sorted out, but at least this provides some intuition as to why I'm attracted to single word names. It is also perhaps telling that the Dart/Flutter API designers have used such short names themselves, which indicates that this may be what they had in mind for idiomatic Dart code.

ryanheise commented 4 years ago

Made a small bug fix release 0.3.1.

Also, the audio_service example is now updated to use just_audio 0.3.1. I haven't yet demonstrated the shuffle and loop features; I've only rewritten the existing functionality based on the new version, and it has certainly made a huge difference in the simplicity of the code.

One of the new features in audio_service is the ability to use seek forward/backward in the iOS control center: buttons that, if you press and hold, continuously fast forward or rewind the audio. Those are really the only callbacks that have a non-trivial implementation, since just_audio doesn't have such a feature to delegate to. However, it is making me consider building this feature into just_audio.

If I add this feature to just_audio, the only question is: how should it work? Continuous fast forward/rewind can be implemented in various ways. The way I implemented it is to periodically seek to currentPosition + offset, where the period and offset are parameters.
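
As a sketch of that approach (the period and offset values are arbitrary, and the position getter name is an assumption):

import 'dart:async';
import 'package:just_audio/just_audio.dart';

Timer startSeekingForward(AudioPlayer player) {
  // Repeatedly jump ahead by a fixed offset while the button is held down.
  return Timer.periodic(Duration(milliseconds: 500), (_) {
    player.seek(player.position + Duration(seconds: 5));
  });
}

// Cancel the returned timer when the button is released.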

Would it be useful to anyone for me to add that to just_audio, or is there an alternative preferred method of implementing seek forward/backward?

ryanheise commented 4 years ago

Thanks to @nuc134r for fixing a bug in the methods for dynamically adding to and removing from a ConcatenatingMediaSource on Android.

I also got the iOS implementation working now too, although for some reason there is a small glitch in the audio when you add or remove while it's playing.

I've also added two new streams:

  1. sequenceStream broadcasts the current list of IndexedAudioSources.
  2. A sequenceState stream, which broadcasts the pair of the current sequence and the current index within that sequence. A guarantee is provided that the components of the sequence state are updated consistently, so if the current index is right on the last item in the playlist and then you delete it, you won't temporarily have the index pointing out of bounds; they will both update at the same time.

alexmercerind commented 4 years ago

Hi @ryanheise! First of all, thanks a LOT for making my day better by making this module. (This question is a bit off topic...) I wanted to ask:

await player.load(
  ConcatenatingAudioSource(
    children: [
      AudioSource.uri(Uri.parse("https://example.com/track1.mp3")), //I'm saying here
      AudioSource.uri(Uri.parse("https://example.com/track2.mp3")),
      AudioSource.uri(Uri.parse("https://example.com/track3.mp3")),
    ],
  ),
);
// Jump to the beginning of track3.mp3.
player.seek(Duration(milliseconds: 0), index: 2);

How do I load a local file into an AudioSource? Sorry if this question is stupid.

ryanheise commented 4 years ago

Hi @alexmercerind , if you take a look at the docs for the AudioPlayer.setFilePath method, it might give you some idea.
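
For example, either of these should work for a local file (a sketch; setFilePath is the convenience method mentioned above):

await player.setFilePath('/path/to/local/file.mp3');
// or, as a composable source:
await player.load(AudioSource.uri(Uri.file('/path/to/local/file.mp3')));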

getmmg commented 4 years ago

Hi @ryanheise

First of all, thank you for all your hard work on this awesome update. I have just updated to the new version 0.3.3. I am listening to icyMetadataStream and sending that information to the UI using a custom event:

 _audioPlayer.icyMetadataStream.listen((IcyMetadata icyMetadata) {
      AudioServiceBackground.sendCustomEvent({icyMetaDataKey: icyMetadata});
    });

I am getting the below exception thrown continuously:

I/flutter (15674): Error parsing event: type '_InternalLinkedHashMap<dynamic, dynamic>' is not a subtype of type 'IcyHeaders'
I/flutter (15674): #0      new IcyMetadata.fromJson (package:just_audio/just_audio.dart:818:47)
I/flutter (15674): #1      new AudioPlayer._internal.<anonymous closure> (package:just_audio/just_audio.dart:122:29)
I/flutter (15674): #2      _MapStream._handleData (dart:async/stream_pipe.dart:219:31)
I/flutter (15674): #3      _ForwardingStreamSubscription._handleData (dart:async/stream_pipe.dart:157:13)
I/flutter (15674): #4      _rootRunUnary (dart:async/zone.dart:1198:47)
I/flutter (15674): #5      _CustomZone.runUnary (dart:async/zone.dart:1100:19)
I/flutter (15674): #6      _CustomZone.runUnaryGuarded (dart:async/zone.dart:1005:7)
I/flutter (15674): #7      _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:357:11)
I/flutter (15674): #8      _DelayedData.perform (dart:async/stream_impl.dart:611:14)
I/flutter (15674): #9      _StreamImplEvents.handleNext (dart:async/stream_impl.dart:730:11)
I/flutter (15674): #10     _PendingEvents.schedule.<anonymous closure> (dart:async/stream_impl.dart:687:7)
I/flutter (15674): #11     _rootRun (dart:async/zone.dart:1182:47)
I/flutter (15674): #12     _CustomZone.run (dart:async/zone.dart:1093:19)
I/flutter (15674): #13     _CustomZone.runGuarded (dart:async/zone.dart:997:7)
I/flutter (15674): #14     _CustomZone.bindCallbackGuarded

ryanheise commented 4 years ago

Hi @getmmg, glad you like the updates! I'm currently in the middle of some other changes which are taking a lot longer than expected, but this one is next on my todo list.

ryanheise commented 4 years ago

Hi @getmmg I've just committed a fix (hopefully). Would you like to test it on git master?

getmmg commented 4 years ago

Hi @ryanheise, it's not throwing exceptions now and is displaying proper data.

ryanheise commented 4 years ago

Great! Thanks for confirming that, I've now published 0.3.4.

alexda12 commented 4 years ago

@ryanheise - fantastic work and commitment on your part to create such a much needed plugin - kudos to you.

I've read this entire thread with regard to gapless/looping on both iOS and Android, and note from your comments here https://github.com/ryanheise/just_audio/issues/131#issuecomment-666046659 that you acknowledge there is still some more work to be done in terms of achieving continuous gapless looping on iOS.

I just wanted to follow up on the status of this - I've tried 0.3.4 and still notice a 50-100ms gap on iOS.

Additionally - on Android, there is also a very tiny gap (approx 10-20ms). I'm loading assets to play theme music on completion of a post-frame callback:

... initState ...
 WidgetsBinding.instance.addPostFrameCallback(_playTrack);
...
 Future<void> _setupAudioPlayer() async {
    String asset = SoundManager.getSoundAssetForPlatform('some music to play');
    await audioPlayer.setAsset(asset);
    await audioPlayer.setLoopMode(LoopMode.one);
  }

  void _playTrack(Duration timeStamp) async {
    await _setupAudioPlayer();
    audioPlayer.play();
  }

Are these still known issues, or is there a better way to achieve 100% looping/gapless playback with minimal resources?

Additionally - I have tested on simulators and live devices (iPhone X, iPhone 11 Max pro, iPhone SE, iPhone 7, iPhone 7+, iPad Pro, iPad Air), Samsung 7/8/9/10 - all exhibit the same behaviour.

ryanheise commented 4 years ago

Hi @alexda12 and thanks for your comment!

I have been busy for a couple of weeks working on another issue related to audio focus which I'll make another comment on. However, to answer your question, I am not sure if there is anything more I can do on the Android side, since gapless playback is actually implemented based on what ExoPlayer provides. Therefore, you might want to check on the ExoPlayer GitHub issues page to see if there is any underlying issue at the ExoPlayer level. Sometimes there are known issues with particular audio formats or other particular usages, and sometimes they can be worked around.

As for iOS, I haven't implemented gapless looping yet. It's on the todo list, but I have a mountain of issues to get through, so I basically have to prioritise things in that list. I think that since LoopingAudioSource should at least be gapless, it should provide an almost adequate workaround in the meantime.

ryanheise commented 4 years ago

The new feature mentioned above that I have been working on is actually a new plugin called audio_session. You can read its README to get an understanding of what it is all about, and it came about partly due to a discussion earlier in this thread about how audio plugins in the Flutter ecosystem currently tend to overwrite each other's audio settings, and also tend to provide duplicate APIs for essentially doing the same thing. It made sense to create a separate plugin that manages your audio session. On iOS, this interfaces with AVAudioSession, and on Android it interfaces with AudioManager.

Now, during your app's initialisation, you should do something like this:

final session = await AudioSession.instance; // also works in the background
session.configure(AudioSessionConfiguration.music());

where music is one pre-made configuration recipe that chooses appropriate settings for a music app. There is also another recipe called speech that provides suitable settings across iOS/Android for podcast/audiobook apps. And you can also create your own custom configuration from scratch if your app has special requirements related to how the app should interact with other audio apps that may be playing at the same time.

I have updated the code for just_audio on the audio-session branch to support this new plugin, and hope to merge this soon. One reason for wanting to merge it soon is so that I can update the audio_session example to use it. It's a chicken and egg problem :-)

alexda12 commented 4 years ago

> Hi @alexda12 and thanks for your comment!
>
> I have been busy for a couple of weeks working on another issue related to audio focus which I'll make another comment on. However, to answer your question, I am not sure if there is anything more I can do on the Android side, since gapless playback is actually implemented based on what ExoPlayer provides. Therefore, you might want to check on the ExoPlayer GitHub issues page to see if there is any underlying issue at the ExoPlayer level. Sometimes there are known issues with particular audio formats or other particular usages, and sometimes they can be worked around.
>
> As for iOS, I haven't implemented gapless looping yet. It's on the todo list, but I have a mountain of issues to get through, so I basically have to prioritise things in that list. I think that since LoopingAudioSource should at least be gapless, it should provide an almost adequate workaround in the meantime.

Thanks Ryan - I will implement LoopingAudioSource and see if that improves things on the iOS side. Yes, I've also read that Android has suffered for a long time with gapless playback, and have heard recommendations that Ogg formats make things somewhat better (I'm using AAC on Android and MP3 on iOS)...

The AudioSession will actually help me in my app, as I'm using other plugins for sound effects, TTS & theme music, and yes - I have noticed that they do overwrite each other's settings, not to mention bloating my app with duplicated APIs. I've only just stumbled onto just_audio and I'm hoping it will provide 100% of my requirements, so that I can discontinue the other plugins and have one unified plugin to handle all my audio needs...

alexda12 commented 4 years ago

Hi @ryanheise - trying to implement LoopingAudioSource throws a FileNotFound exception for 'assets/sound/themeMusic.aac'.

This works


   var concat = ConcatenatingAudioSource(
      children: [
        AudioSource.uri(Uri.parse("asset:///$asset")),
        AudioSource.uri(Uri.parse("asset:///$asset")),
        AudioSource.uri(Uri.parse("asset:///$asset")),
        AudioSource.uri(Uri.parse("asset:///$asset")),
      ],
    );

    audioPlayer.load(concat);

This fails

var concat = ConcatenatingAudioSource(
      children: [
        AudioSource.uri(Uri.parse("asset:///$asset")),
        AudioSource.uri(Uri.parse("asset:///$asset")),
        AudioSource.uri(Uri.parse("asset:///$asset")),
        AudioSource.uri(Uri.parse("asset:///$asset")),
      ],
    );

    audioPlayer.load(LoopingAudioSource(child: concat, count: 1));

ryanheise commented 4 years ago

@alexda12 would you mind providing a minimal reproduction project with assets included? This will help me to investigate.

While I understand that your example above is probably intentionally contrived, I do want to just point out that it is unusual to have a looping audio source with a count of "1" ;-) Although, it does have the redeeming quality of revealing a bug which I otherwise wouldn't have known about.

Less obviously, I would point out that it is not useful to concatenate the same file 4 times with ConcatenatingAudioSource because this is exactly what LoopingAudioSource does internally for iOS anyway. You can try this instead:

await audioPlayer.load(LoopingAudioSource(
  child: AudioSource.uri(Uri.parse("asset:///$asset")),
  count: 10,
));
await audioPlayer.setLoopMode(LoopMode.all);

This way, you get a gap once every 10 iterations.

alexda12 commented 4 years ago

@ryanheise

Sorry for delay in getting back , completely missed this notification.

Yes, the last example I provided was just a simple code fragment to show ConcatenatingAudioSource working in harmony with Uri.parse and not throwing an error, as opposed to LoopingAudioSource, which does throw an error.

My example is pretty basic - prior to trying to get looping working, I simply have a very simple audio helper class defined as:

class ThemeMusicHelper {
  final AudioPlayer audioPlayer = AudioPlayer();

  Future<void> _setupAudioPlayer() async {
    String asset = SoundManager.getSoundAssetForPlatform('gameTheme');
 /// will return either /assets/sound/gameTheme.aac or /assets/sound/gameTheme.mp3 based on the platform

    await audioPlayer.setAsset(asset);
    await audioPlayer.setLoopMode(LoopMode.one);
  }

  void playTrack(Duration timeStamp) async {
    await _setupAudioPlayer();
    audioPlayer.play();
  }
}

ThemeMusicHelper is accessed within any stateful widget via a mixin, such as:


class _MyCustomWidgetUsingAudioState extends State<MyCustomWidgetUsingAudio>
    with
        WidgetsBindingObserver,
        SingleTickerProviderStateMixin,
        ThemeMusicHelper {
....
}

Yes - we also dispose of the AudioPlayer...

And finally, my playTrack gets triggered via: WidgetsBinding.instance.addPostFrameCallback(playTrack);

All of this works seamlessly on both iOS and Android. There is no issue. The problem is when we try to add the simple LoopingAudioSource - we get an exception on Android that the asset could not be found...

The

await audioPlayer.setAsset(asset);

is just a simple helper for:

load(AudioSource.uri(Uri.parse('asset:///$assetPath')));

But:

AudioSource.uri(Uri.parse('asset:///$assetPath'))

used within a LoopingAudioSource appears to fall over...

Note: this was tested on Android, and it falls over on Android with the file-not-found error. On iOS, we do not get any exception - just no audio is played...

For various reasons (we're close to releasing to the app store and have ironed out all issues in this version over the past 6 months), I am using Flutter 1.12.13 hotfix 5.

We are not sure if LoopingAudioSource has issues with schemes other than file, http, etc., since we are using the asset scheme.

ryanheise commented 4 years ago

@alexda12 did you try the code I suggested? If it doesn't work (and even if it does work), I would still appreciate a minimal reproduction project as mentioned above. You can submit one via the "New issue" button. This will be the fastest way for me to investigate the bug and fix it.

alexda12 commented 4 years ago

@ryanheise - yes, I tried the code, and the issue is that the URI is rejected when you send a parsed asset URI to AudioSource and then inject this into LoopingAudioSource. The asset should be converted to a logical file handle (as can be seen if you debug the source.toJson() call in _load)...

I will try and package up a reproducible test case later today

[screenshot: debugger output showing the rejected asset URI in the _load call]

ryanheise commented 4 years ago

Your description and screenshot give me an idea of what it could be. I'll have a quick attempt at a fix for you...