ryanheise / just_audio

Audio Player

[iOS][StreamAudioSource] PlayerException ((-11850) Operation Stopped) #685

Open andrea689 opened 2 years ago

andrea689 commented 2 years ago

Which API doesn't behave as documented, and how does it misbehave? When I use byte array audio, Android and Web work correctly, but on iOS I get this error: PlayerException ((-11850) Operation Stopped)

P.S. I don't know why Android needs android:usesCleartextTraffic="true" to work

Minimal reproduction project

https://github.com/andrea689/just_audio_ios_error

main.dart

```dart
import 'dart:convert';
import 'dart:typed_data';

import 'package:collection/collection.dart';
import 'package:flutter/material.dart';
import 'package:http/http.dart' as http;
import 'package:just_audio/just_audio.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return const MaterialApp(
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatelessWidget {
  const MyHomePage({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Sound Test'),
      ),
      body: Center(
        child: FutureBuilder(
          future: http.get(
              Uri.parse('https://filebin.net/4i2f18nheahilka7/audio.json')),
          builder: (context, snapshot) {
            if (snapshot.hasData) {
              final dataBuffer = Uint8List.fromList(
                  List.from(jsonDecode(snapshot.data!.body)['bytes']));
              return SoundPlayerUI(dataBuffer: dataBuffer);
            }
            return const CircularProgressIndicator();
          },
        ),
      ),
    );
  }
}

class SoundPlayerUI extends StatefulWidget {
  final Uint8List dataBuffer;

  const SoundPlayerUI({
    Key? key,
    required this.dataBuffer,
  }) : super(key: key);

  @override
  State createState() => _SoundPlayerUIState();
}

class _SoundPlayerUIState extends State<SoundPlayerUI> {
  late AudioPlayer _audioPlayer;
  Duration duration = const Duration();

  @override
  void initState() {
    super.initState();
    _audioPlayer = AudioPlayer();
    _audioPlayer
        .setAudioSource(MyAudioSource(widget.dataBuffer))
        .then((value) => setState(() => duration = value ?? const Duration()))
        .catchError((error) {
      // catch load errors: 404, invalid url ...
      print("An error occurred $error");
    });
  }

  @override
  void dispose() {
    _audioPlayer.dispose();
    super.dispose();
  }

  String _printDuration(Duration duration) {
    String twoDigits(int n) => n.toString().padLeft(2, "0");
    String twoDigitMinutes = twoDigits(duration.inMinutes.remainder(60));
    String twoDigitSeconds = twoDigits(duration.inSeconds.remainder(60));
    return "$twoDigitMinutes:$twoDigitSeconds";
  }

  @override
  Widget build(BuildContext context) {
    return Card(
      child: Row(
        children: [
          StreamBuilder<PlayerState>(
            stream: _audioPlayer.playerStateStream,
            builder: (_, snapshot) {
              final processingState = snapshot.data?.processingState;
              if (processingState == ProcessingState.loading ||
                  processingState == ProcessingState.buffering) {
                return Center(
                  child: Container(
                    margin: const EdgeInsets.all(12),
                    width: 24,
                    height: 24,
                    child: const CircularProgressIndicator(),
                  ),
                );
              }
              if (_audioPlayer.playing == false) {
                return IconButton(
                  icon: const Icon(Icons.play_arrow),
                  color: Theme.of(context).colorScheme.primary,
                  onPressed: () {
                    _audioPlayer.play();
                  },
                );
              }
              if (processingState != ProcessingState.completed) {
                return IconButton(
                  icon: const Icon(Icons.pause),
                  color: Theme.of(context).colorScheme.primary,
                  onPressed: () {
                    _audioPlayer.pause();
                  },
                );
              }
              return IconButton(
                icon: const Icon(Icons.replay),
                color: Theme.of(context).colorScheme.primary,
                onPressed: () {
                  _audioPlayer.stop();
                  _audioPlayer.seek(
                    Duration.zero,
                    index: _audioPlayer.effectiveIndices?.firstOrNull,
                  );
                  _audioPlayer.play();
                },
              );
            },
          ),
          Expanded(
            child: StreamBuilder<Duration>(
              stream: _audioPlayer.positionStream,
              builder: (context, snapshot) {
                final currentDuration = snapshot.data ?? const Duration();
                final totalDuration =
                    duration.inMilliseconds == 0 ? 1 : duration.inMilliseconds;
                final position = currentDuration.inMilliseconds / totalDuration;
                return Row(
                  children: [
                    Text(
                      '${_printDuration(currentDuration)} / ${_printDuration(duration)}',
                    ),
                    const SizedBox(width: 16),
                    Expanded(
                      child: ClipRRect(
                        borderRadius:
                            const BorderRadius.all(Radius.circular(10)),
                        child: LinearProgressIndicator(
                          value: position,
                          minHeight: 6,
                        ),
                      ),
                    ),
                    const SizedBox(width: 16),
                  ],
                );
              },
            ),
          ),
        ],
      ),
    );
  }
}

class MyAudioSource extends StreamAudioSource {
  final Uint8List _buffer;

  MyAudioSource(this._buffer) : super(tag: 'MyAudioSource');

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    // Returning the stream audio response with the parameters.
    return StreamAudioResponse(
      sourceLength: _buffer.length,
      // The length of the returned range is end - start (the original
      // report computed start - end, which yields a negative value).
      contentLength: (end ?? _buffer.length) - (start ?? 0),
      offset: start ?? 0,
      stream: Stream.fromIterable([_buffer.sublist(start ?? 0, end)]),
      contentType: 'audio/wav',
    );
  }
}
```

To Reproduce (i.e. user steps, not code)

Steps to reproduce the behavior:

  1. Open app

Error messages

PlayerException ((-11850) Operation Stopped)

Expected behavior

Correct audio playback

Smartphone (please complete the following information):

Flutter SDK version

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 2.10.1, on macOS 11.6 20G165 darwin-x64, locale en-GB)
[✓] Android toolchain - develop for Android devices (Android SDK version 30.0.2)
[!] Xcode - develop for iOS and macOS (Xcode 12.5.1)
    ! Flutter recommends a minimum Xcode version of 13.
      Download the latest version or update via the Mac App Store.
[✓] Chrome - develop for the web
[✓] Android Studio (version 2021.1)
[✓] VS Code (version 1.65.2)
[✓] Connected device (5 available)
[✓] HTTP Host Availability

! Doctor found issues in 1 category.
ryanheise commented 2 years ago

You didn't follow the instructions for submitting a minimal reproduction project. I will need the link.

andrea689 commented 2 years ago

@ryanheise sorry, this is the link: https://github.com/andrea689/just_audio_ios_error

ryanheise commented 2 years ago

For sanity, can you try rewriting the same example but hosting the remote file in WAV format rather than JSON? That will make it easier to confirm whether you have valid or invalid audio data.

P.S. I don't know why Android needs android:usesCleartextTraffic="true" to work

Because just_audio creates a proxy on http://localhost:.... to serve stream audio sources, and 'http' (rather than 'https') requires the android:usesCleartextTraffic option.
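For reference, the manifest change in question is a single attribute on the application element; a minimal sketch (the rest of the element is elided):

```xml
<!-- android/app/src/main/AndroidManifest.xml -->
<application
    android:usesCleartextTraffic="true">
    <!-- ... activities, metadata, etc. ... -->
</application>
```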

andrea689 commented 2 years ago

@ryanheise I updated the repo

ryanheise commented 2 years ago

I haven't figured out why it doesn't work yet, however I have discovered that your code will work if you use mp3 instead of wav, so there might be a workaround you could use in the meantime.

andrea689 commented 2 years ago

@ryanheise unfortunately I only have wav samples.. thanks anyway!

ryanheise commented 2 years ago

You can't convert those wav files to MP3 using ffmpeg or similar?

andrea689 commented 2 years ago

I should change the endpoint that generates the wav and currently I can't.

Do you think this is a problem that you will be able to solve?

Otherwise I would have to use flutter_sound for iOS and just_audio for Android, but I would like to use only one library.

Until now I have been using flutter_sound which is no longer maintained, so I was migrating to just_audio due to a crash problem in some Android devices (https://github.com/Canardoux/flutter_sound/issues/780)

ryanheise commented 2 years ago

Another workaround that should work is to download the json, reconstruct the raw byte data, write that to a file with a .wav filename extension, and then use AudioSource.uri with Uri.file(filePath).
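A minimal sketch of that workaround, assuming the same JSON shape as the reproduction project (a top-level 'bytes' array); the helper name and temp-file location are illustrative:

```dart
import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';

import 'package:http/http.dart' as http;
import 'package:just_audio/just_audio.dart';

/// Downloads the JSON, reconstructs the raw bytes, writes them to a
/// temporary file with a .wav extension (iOS uses the extension to
/// determine the file type), and loads it with AudioSource.uri.
Future<Duration?> loadWavFromJson(AudioPlayer player, Uri jsonUri) async {
  final response = await http.get(jsonUri);
  final bytes =
      Uint8List.fromList(List<int>.from(jsonDecode(response.body)['bytes']));
  final file = File('${Directory.systemTemp.path}/audio.wav');
  await file.writeAsBytes(bytes);
  return player.setAudioSource(AudioSource.uri(Uri.file(file.path)));
}
```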

andrea689 commented 2 years ago

ok, now I try it, thanks

andrea689 commented 2 years ago

It works!

I decided to write to a file on Android and iOS, and keep the byte array approach on the web. This way, no HTTP proxy is needed on Android.
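That platform split could look roughly like this (MyAudioSource is the custom StreamAudioSource from the reproduction code; writeBytesToTempWav is a hypothetical helper that writes the bytes to a temporary .wav file):

```dart
import 'dart:io';
import 'dart:typed_data';

import 'package:flutter/foundation.dart' show kIsWeb;
import 'package:just_audio/just_audio.dart';

Future<Duration?> loadAudio(AudioPlayer player, Uint8List bytes) async {
  if (kIsWeb) {
    // Web: feed the byte array directly via the custom StreamAudioSource.
    return player.setAudioSource(MyAudioSource(bytes));
  }
  // Android/iOS: play from a real file instead; this also means just_audio's
  // localhost proxy (and usesCleartextTraffic on Android) is not needed.
  final File file = await writeBytesToTempWav(bytes); // hypothetical helper
  return player.setAudioSource(AudioSource.uri(Uri.file(file.path)));
}
```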

Many thanks!

ryanheise commented 2 years ago

Glad to hear.

Let's still keep this issue open, though, since I will eventually want to look into why StreamAudioSource isn't working with wav content.

MyisCARRY commented 2 years ago

@andrea689 I got the same problem, but I am using the setUrl method. I fixed it by adding byte range support to the responses on my backend (the backend needs to support it).

This is the part from package documentation that I am referring to:

The iOS player relies on server headers (e.g. Content-Type, Content-Length and byte range requests) to know how to decode the file and where applicable to report its duration. In the case of files, iOS relies on the file extension.
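For illustration, a range-aware response to the two-byte probe request iOS typically sends first (Range: bytes=0-1) looks something like the following; the total size here is an example:

```http
HTTP/1.1 206 Partial Content
Content-Type: audio/wav
Accept-Ranges: bytes
Content-Range: bytes 0-1/7347742
Content-Length: 2
```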

mt633 commented 2 years ago

I just ran into this issue as well with AAC files converted with FFmpeg. As you said, it works with MP3, but for me it even works with the original WAV file. Here are some sample files you could use to recreate the issue: example-files.zip

As previously mentioned, it works on Android but not on iOS.

I'm trying to protect the file by storing it in a password-protected ZIP file and then reading the stream from the archive, so I'd prefer not to unpack the archive and store a temporary file somewhere, even if that would be a functional workaround.

If AAC could work, I'd prefer that over using MP3.

ryanheise commented 2 years ago

Thanks for providing the test files. I don't have any answers yet as to why this is happening, because the proxy headers, including the content type, all looked right to me last time I investigated. Have you tested whether your files work when pulled directly from some server URL? If that works, it's a matter of comparing that server's HTTP headers with the headers the proxy generates to see where it's going wrong.

mt633 commented 2 years ago

You mean just something like this?

audioPlayer.setUrl('http://localhost:8000/boxaac.m4a');
audioPlayer.play();

If so, then yes, it works.

mt633 commented 2 years ago

If it helps, this seems to be the line of code where the library runs into the error: https://github.com/ryanheise/just_audio/blob/29f201dff0a24e62acf07277f3226a504bb9e9d3/just_audio/lib/just_audio.dart#L784

ryanheise commented 2 years ago

You mean just something like this?

audioPlayer.setUrl('http://localhost:8000/boxaac.m4a');
audioPlayer.play();

If so, then yes, it works.

Wait, what server is that? If that's the proxy itself, then that's certainly not what I meant because in that case there would be no expected difference in headers. Although if it is the proxy you are testing, it is surprising to hear that it works with setUrl.

mt633 commented 2 years ago

No, it's just a locally hosted web server to try to stream the file with setUrl. Instead of publishing it online I found it easier to do that.

ryanheise commented 2 years ago

In that case, I still can't connect to it and check the headers myself. Can you?

mt633 commented 2 years ago

I'll see if I can find the headers you're looking for; meanwhile, you might want to test e.g. this URL I found when searching GitHub for .m4a. It behaves the same way for me: I can get that URL to play directly in just_audio using setUrl, but if I download it and use a custom StreamAudioSource to play it, it won't work.

mt633 commented 2 years ago

Not entirely sure what headers you want, but if you point me to the place in the code where you want the variables checked, I can do that.

Another interesting finding is that if I change contentType in StreamAudioResponse to 'audio/wav' instead, the m4a file plays as it should. Setting it to 'audio/aac' or any other format throws the same error as before.

ryanheise commented 2 years ago

In the code, you can print out the proxy's headers in _proxyHandlerForSource. Then we want to compare those headers with the headers of another web server that works. If it's a public web server, I would generally use curl to see what headers come back in the response, e.g. curl -s -D - -o /dev/null -H "Range: bytes=0-1" <url>.

It is interesting that putting the wrong content type would make it work.

mt633 commented 2 years ago

Not sure I fully understand what you are after, but I set a breakpoint here: https://github.com/ryanheise/just_audio/blob/29f201dff0a24e62acf07277f3226a504bb9e9d3/just_audio/lib/just_audio.dart#L3020

That gave the following output from the header variable in the request response. The two continued to look the same on the second and third breaks; then the AAC version failed, whereas the WAV version ran a fourth time and then started playing.

With content type aac

_HttpHeaders (content-type: audio/aac
set-cookie: DARTSESSID=5d318dc2d814d2798f736cacf7f3226e; Path=/; HttpOnly
accept-ranges: bytes
content-length: 2
content-range: bytes 0-1/7347742
)

With content type wav

_HttpHeaders (content-type: audio/wav
set-cookie: DARTSESSID=bcccc9a2cce788738dd66e890a22f4a7; Path=/; HttpOnly
accept-ranges: bytes
content-length: 2
content-range: bytes 0-1/7347742
)

Web server headers

Server: Apache/2.4.51 (Unix) OpenSSL/1.1.1k PHP/8.0.12 mod_perl/2.0.11 Perl/v5.32.1
Last-Modified: Tue, 19 Apr 2022 07:17:36 GMT
ETag: "701e1e-5dcfcabd0a400"
Accept-Ranges: bytes
Content-Length: 7347742
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive

After doing this, I checked the GitHub URL I posted earlier, which returned the following:

Connection: keep-alive
Content-Length: 65407
Cache-Control: max-age=300
content-disposition: attachment; filename=sounds/Beta_m4a/samples/BassWumm_A.m4a
Content-Security-Policy: default-src 'none'; style-src 'unsafe-inline'; sandbox
Content-Type: audio/mp4
ETag: W/"90bee47ac11adb72b15cc1d8018a51c21380a04185237567fb4d8bd6e44e9ca2"
Strict-Transport-Security: max-age=31536000
X-Content-Type-Options: nosniff
X-Frame-Options: deny
X-XSS-Protection: 1; mode=block
X-GitHub-Request-Id: 7498:0E31:B7FC1:EFCE8:625E7AA4
Accept-Ranges: bytes
Date: Tue, 19 Apr 2022 09:02:28 GMT
Via: 1.1 varnish
X-Served-By: cache-bma1633-BMA
X-Cache: MISS
X-Cache-Hits: 0
X-Timer: S1650358948.462872,VS0,VE401
Vary: Authorization,Accept-Encoding,Origin
Access-Control-Allow-Origin: *
X-Fastly-Request-ID: 077f796d939110411cb917232a21e4798809d130
Expires: Tue, 19 Apr 2022 09:07:28 GMT
Source-Age: 0

That made me realize that the correct MIME type for m4a files is audio/mp4, and using that works for me with just_audio. audio/aac is apparently only for raw ADTS streams.

In other words, this is no longer an issue for me in my current setup, so I'll leave further investigation to you.
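To summarize the finding as code: the content type handed to StreamAudioResponse must be the correct MIME type for the container, not the codec. A sketch of an extension-based lookup (the mappings follow this thread and common registrations; names are illustrative, adjust as needed):

```dart
// Container MIME types that iOS accepts, per the discussion above.
const audioMimeByExtension = {
  'mp3': 'audio/mpeg',
  'wav': 'audio/wav',
  'm4a': 'audio/mp4', // not 'audio/aac', which denotes raw ADTS streams
  'aac': 'audio/aac',
};

String audioContentTypeFor(String fileName) {
  final ext = fileName.split('.').last.toLowerCase();
  return audioMimeByExtension[ext] ?? 'application/octet-stream';
}
```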

ryanheise commented 2 years ago

Which API doesn't behave as documented, and how does it misbehave? When I use byte array audio, Android and Web work correctly, but on iOS I get this error: PlayerException ((-11850) Operation Stopped)

P.S. I don't know why Android needs android:usesCleartextTraffic="true" to work

FYI, I have just updated the iOS setup documentation in the README with the correct documentation for the iOS equivalent of usesCleartextTraffic. I think this section was originally correct, but last year I added another option (for iOS 10+) that turns off the older one. You will actually get the correct behaviour on all versions if you use the older iOS 9 option. Details are in the README and the official example's Info.plist.

cameralis commented 2 years ago

What the hell. I am streaming AAC in an mp4 container (audio/mp4). I spent almost 4 nights trying to figure out why the player was not working on iOS. After setting the MIME type to audio/mp3 (it's still not an mp3), it suddenly works (almost) perfectly?!

ryanheise commented 2 years ago

@55nknown are you using a feature that enables the proxy, such as HTTP headers or LockCachingAudioSource or StreamAudioSource?

cameralis commented 2 years ago

I am using StreamAudioSource

caseycrogers commented 2 years ago

I'm running into the same issue: .m4a plays fine before running it through ffmpeg, but fails after.

In case another example is at all helpful, here's the command (for debugging purposes I've trimmed it down to just decode and re-encode): ffmpeg -i "var/mobile/.../recording_2022_10_02_24527.m4a" var/mobile/.../recording_2022_10_02_24527_denoised.m4a

Here are the files: testing clips.zip

Let me know if there is anything else I can do to help! In the meantime I'll use a streaming audio source instead of setFile and manually specify the content type as others have done above.

ryanheise commented 2 years ago

@caseycrogers if you're using setFile, then you have a different issue because this issue is about a problem that occurs when using StreamAudioSource. When using setFile, you are depending on iOS's method of using the file extension to determine the file type. just_audio doesn't have a say in what iOS does there, so you would need to read the iOS documentation to see what filename extensions it recognises for what types.

caseycrogers commented 2 years ago

Both clips are .m4a so the issue must not be dependent on the file extension?

All the same, I'm having trouble finding relevant iOS documentation (Apple is very bad at this). I'm digging through just_audio's source code to find a spot where I can add a breakpoint to check the type just_audio is receiving, and I'm not finding one. Could you point me to a spot to check, and I'll come back with the values?

Edit: also, I piled onto this issue instead of creating a new one because my problem sounds very similar to mt633's: can't play an m4a exported from ffmpeg. However, if you think this is a different issue and it makes things easier, I'm happy to file a fresh one.

ryanheise commented 2 years ago

As explained above, when dealing with setFile, you are experiencing the direct behaviour of iOS itself. There's nothing I can do about that, so you will just need to consult the iOS documentation. Just because two files have the same extension does not mean they will both be handled equally well by iOS, since the two files may be encoded differently and iOS may have different support for different encodings. Consult iOS's documentation for that matter.

lcw99 commented 1 year ago

I have the same problem with StreamAudioSource not working on iOS. I have tried various packages and they all have similar issues. However, I found https://pub.dev/packages/audiofileplayer, which works on Android and iOS via its loadFromByteData function. This might give you a clue to solving the problem.

ryanheise commented 1 year ago

There are no plans to use that type of solution since transmitting large byte arrays over method channels can potentially lock up the UI. However, the particular use case of playing audio from a byte array has other solutions besides StreamAudioSource, such as https://github.com/ryanheise/just_audio/issues/685#issuecomment-1069188124

anisalibegic commented 2 weeks ago

I had the same problem when storing an audio file in a database as a byte array. I was using ASP.NET Core to return the byte array for a specific file ID. Here are my attempts with ASP.NET Core...

First attempt

Endpoint: /files/1 Server-side code:

var file = ...;// get from database
return File(file.Content);

Status: NOT WORKING


Second attempt

I read that iOS needs file extension inside URL, as well as content type so I added both.

Endpoint: /files/1.wav Server-side code:

var file = ...;// get from database
return File(file.Content, "audio/wav");

Status: NOT WORKING


Third attempt

After reading all these comments, I tried to do something with byte ranges. There is a third parameter of the File method called enableRangeProcessing. Let's try setting it to true.

Endpoint: /files/1.wav Server-side code:

var file = ...;// get from database
return File(file.Content, "audio/wav", true); // true is for enableRangeProcessing

Status: WORKING!!!

So, I can confirm that the extension, content type, content length and byte range support all need to be present in order for it to work.