andrea689 opened this issue 2 years ago
You didn't follow the instructions for submitting a minimal steps reproduction project. I will need the link.
@ryanheise sorry, this is the link: https://github.com/andrea689/just_audio_ios_error
For sanity, can you try rewriting the same example but hosting the remote file in WAV format rather than JSON? That will make it easier to confirm whether you have valid or invalid audio data.
P.S. I don't know why Android needs `android:usesCleartextTraffic="true"` to work.

Because just_audio creates a proxy on http://localhost:... to serve stream audio sources, and that 'http' (rather than 'https') scheme requires the `android:usesCleartextTraffic` option.
@ryanheise I updated the repo
I haven't figured out why it doesn't work yet. However, I have discovered that your code will work if you use MP3 instead of WAV, so there might be a workaround you could use in the meantime.
@ryanheise unfortunately I only have WAV samples... thanks anyway!
You can't convert those wav files to MP3 using ffmpeg or similar?
I would have to change the endpoint that generates the WAV, and currently I can't.
Do you think this is a problem that you will be able to solve?
Otherwise I would have to use flutter_sound for iOS and just_audio for Android, but I would like to use only one library. Until now I have been using flutter_sound, which is no longer maintained, and I was migrating to just_audio due to a crash on some Android devices (https://github.com/Canardoux/flutter_sound/issues/780).
Another workaround that should work is to download the JSON, reconstruct the raw byte data, write that to a file with a `.wav` filename extension, and then use `AudioSource.uri` with `Uri.file(filePath)`.
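That workaround could be sketched roughly as follows (a minimal sketch, not code from this thread: `fetchWavBytes` is a hypothetical stand-in for whatever reconstructs the raw bytes from the JSON endpoint, and error handling is omitted):

```dart
import 'dart:io';
import 'dart:typed_data';

import 'package:just_audio/just_audio.dart';

// Hypothetical helper: reconstruct the raw WAV bytes from the JSON endpoint.
Future<Uint8List> fetchWavBytes() async => throw UnimplementedError();

Future<void> playWavFromBytes(AudioPlayer player) async {
  final bytes = await fetchWavBytes();
  // Write the bytes to a temporary file with a .wav extension so that
  // iOS can use the filename extension to determine the file type.
  final file = File('${Directory.systemTemp.path}/audio.wav');
  await file.writeAsBytes(bytes);
  await player.setAudioSource(AudioSource.uri(Uri.file(file.path)));
  await player.play();
}
```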
OK, I'll try it now, thanks.
It works!
I decided to write to file for Android and iOS, and leave the byte array in the web. This way, no http proxy is needed on Android.
Many thanks!
Glad to hear.
Let's still keep this issue open, though, since I will eventually want to look into why `StreamAudioSource` isn't working with WAV content.
@andrea689 I got the same problem, but I am using the `setUrl` method. I fixed it by adding byte-range support to my request on the backend (the backend needs to add it).
This is the part from package documentation that I am referring to:
> The iOS player relies on server headers (e.g. Content-Type, Content-Length and byte range requests) to know how to decode the file and, where applicable, to report its duration. In the case of files, iOS relies on the file extension.
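In `StreamAudioSource` terms, that means filling in the fields of `StreamAudioResponse` that the proxy turns into those headers. A minimal sketch serving an in-memory buffer (the `BufferAudioSource` name and the buffer are illustrative, not from this thread; the `contentType` value is the part that turned out to matter on iOS):

```dart
import 'dart:typed_data';

import 'package:just_audio/just_audio.dart';

// Serves a byte buffer through just_audio's local proxy, honouring the
// range requests that the iOS player issues when probing the source.
class BufferAudioSource extends StreamAudioSource {
  final Uint8List bytes;
  final String contentType;

  BufferAudioSource(this.bytes, {this.contentType = 'audio/mp4'});

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    start ??= 0;
    end ??= bytes.length;
    return StreamAudioResponse(
      sourceLength: bytes.length,
      contentLength: end - start,
      offset: start,
      stream: Stream.value(bytes.sublist(start, end)),
      // iOS uses this (together with Content-Length and byte ranges)
      // to decide how to decode the stream.
      contentType: contentType,
    );
  }
}
```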
I just ran into this issue as well with AAC files converted with FFmpeg. As you said, it works with MP3, but for me it even works with the original WAV file. Here are some sample files you could use to recreate the issue: example-files.zip
As previously mentioned, it works on Android but not on iOS.
I'm trying to protect the file by storing it in a password protected ZIP file and then read the stream from the archive, so I prefer not to unpack the archive and store a temporary file somewhere, even if that would be a functional workaround.
If AAC could work, I'd prefer that over using MP3.
Thanks for providing the test files. I don't have any answers yet as to why this is happening because the proxy headers, including the content type, all looked right to me last time I investigated. Have you tested if your files work fine when pulled directly from some server URL? If that works, it's a matter of comparing the HTTP headers of that server with the headers the proxy generates to see where it's going wrong.
You mean just something like this?

    audioPlayer.setUrl('http://localhost:8000/boxaac.m4a');
    audioPlayer.play();

If so, then yes, it works.
If it helps, this seems to be the line of code where the library runs into the error: https://github.com/ryanheise/just_audio/blob/29f201dff0a24e62acf07277f3226a504bb9e9d3/just_audio/lib/just_audio.dart#L784
> You mean just something like this? `audioPlayer.setUrl('http://localhost:8000/boxaac.m4a'); audioPlayer.play();` If so, then yes, it works.
Wait, what server is that? If that's the proxy itself, then that's certainly not what I meant, because in that case there would be no expected difference in headers. Although if it is the proxy you are testing, it is surprising to hear that it works with `setUrl`.
No, it's just a locally hosted web server to try streaming the file with `setUrl`. Instead of publishing the file online, I found it easier to do that.
In that case, I still can't connect to it and check the headers myself. Can you?
I'll see if I can find the headers you're looking for; meanwhile, you might want to test e.g. this URL I found when searching GitHub for .m4a. It behaves the same way for me: I can get that URL to play directly in just_audio using `setUrl`, but if I download it and use a custom `StreamAudioSource` to play it, it won't work.
Not entirely sure which headers you want, but if you point me to the place in the code where you want the variables checked, I can do that.
Another interesting finding is that if I change `contentType` in `StreamAudioResponse` to `contentType: 'audio/wav'` instead, the m4a file plays as it should. Setting it to `'audio/aac'` or any other format throws the same error as before.
In the code, you can print out the proxy's headers in `_proxyHandlerForSource`. Then we want to compare those headers with another web server that works. If it's a public web server, I would generally use `curl` to see what headers come back in the response.

It is interesting that putting in the wrong content type would cause it to work.
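If curl isn't handy, the same header check can be done from Dart itself (a small sketch using only dart:io; the two-byte range request mimics the initial probe the player makes, and the URL is whatever server is under test):

```dart
import 'dart:io';

// Issues a ranged GET against the given URL and prints the status line and
// response headers, which is what the iOS player sees when probing audio.
Future<void> dumpHeaders(Uri url) async {
  final client = HttpClient();
  final request = await client.getUrl(url);
  // Ask for only the first two bytes, like the player's initial range probe.
  request.headers.set(HttpHeaders.rangeHeader, 'bytes=0-1');
  final response = await request.close();
  print('HTTP ${response.statusCode}');
  response.headers
      .forEach((name, values) => print('$name: ${values.join(', ')}'));
  await response.drain<void>();
  client.close();
}
```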
Not sure I fully understand what you are after, but I set a breakpoint here: https://github.com/ryanheise/just_audio/blob/29f201dff0a24e62acf07277f3226a504bb9e9d3/just_audio/lib/just_audio.dart#L3020

That gave the following output from the header variable in the request response. The two continued to look the same on the second and third breaks; then the aac version failed, whereas the wav version ran a fourth time and then started playing.
aac:

    _HttpHeaders (content-type: audio/aac
    set-cookie: DARTSESSID=5d318dc2d814d2798f736cacf7f3226e; Path=/; HttpOnly
    accept-ranges: bytes
    content-length: 2
    content-range: bytes 0-1/7347742
    )

wav:

    _HttpHeaders (content-type: audio/wav
    set-cookie: DARTSESSID=bcccc9a2cce788738dd66e890a22f4a7; Path=/; HttpOnly
    accept-ranges: bytes
    content-length: 2
    content-range: bytes 0-1/7347742
    )
    Server: Apache/2.4.51 (Unix) OpenSSL/1.1.1k PHP/8.0.12 mod_perl/2.0.11 Perl/v5.32.1
    Last-Modified: Tue, 19 Apr 2022 07:17:36 GMT
    ETag: "701e1e-5dcfcabd0a400"
    Accept-Ranges: bytes
    Content-Length: 7347742
    Keep-Alive: timeout=5, max=100
    Connection: Keep-Alive
When I had done this, I checked the GitHub URL I posted earlier which returned the following:
    Connection: keep-alive
    Content-Length: 65407
    Cache-Control: max-age=300
    content-disposition: attachment; filename=sounds/Beta_m4a/samples/BassWumm_A.m4a
    Content-Security-Policy: default-src 'none'; style-src 'unsafe-inline'; sandbox
    Content-Type: audio/mp4
    ETag: W/"90bee47ac11adb72b15cc1d8018a51c21380a04185237567fb4d8bd6e44e9ca2"
    Strict-Transport-Security: max-age=31536000
    X-Content-Type-Options: nosniff
    X-Frame-Options: deny
    X-XSS-Protection: 1; mode=block
    X-GitHub-Request-Id: 7498:0E31:B7FC1:EFCE8:625E7AA4
    Accept-Ranges: bytes
    Date: Tue, 19 Apr 2022 09:02:28 GMT
    Via: 1.1 varnish
    X-Served-By: cache-bma1633-BMA
    X-Cache: MISS
    X-Cache-Hits: 0
    X-Timer: S1650358948.462872,VS0,VE401
    Vary: Authorization,Accept-Encoding,Origin
    Access-Control-Allow-Origin: *
    X-Fastly-Request-ID: 077f796d939110411cb917232a21e4798809d130
    Expires: Tue, 19 Apr 2022 09:07:28 GMT
    Source-Age: 0
That made me realize that the correct way to write the MIME type of m4a files is `audio/mp4`, and using that works for me with just_audio. `audio/aac` is apparently only for raw streams (ADTS).
This is, in other words, no longer an issue for me in my current setup, so I'll leave further investigation to you.
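The MIME-type distinction that resolved this can be captured in a small lookup (a sketch; `audio/mp4` is the registered type for `.m4a` files, while `audio/aac` denotes raw ADTS streams):

```dart
// Common audio MIME types; note that .m4a is audio/mp4, not audio/aac.
const audioMimeTypes = <String, String>{
  'mp3': 'audio/mpeg',
  'm4a': 'audio/mp4',
  'aac': 'audio/aac', // raw ADTS streams only
  'wav': 'audio/wav',
};

// Looks up the MIME type from a path's filename extension.
String? mimeTypeForExtension(String path) =>
    audioMimeTypes[path.split('.').last.toLowerCase()];
```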
> Which API doesn't behave as documented, and how does it misbehave? When I use byte array audio, Android and Web work correctly, but with iOS I have this error:
>
> PlayerException ((-11850) Operation Stopped)
>
> P.S. I don't know why Android needs `android:usesCleartextTraffic="true"` to work
FYI, I have just updated the iOS setup documentation in the README with the correct documentation for the iOS equivalent of `usesCleartextTraffic`. I think this section was originally correct, but then I added another option last year for iOS 10+ which turns off the other option; you will actually get the correct behaviour on all versions if you use the older iOS 9 option. Details are in the README and the official example's `Info.plist`.
What the hell. I am streaming AAC in an mp4 container (audio/mp4). I spent almost 4 nights trying to figure out why the player was not working on iOS. After setting the MIME type to audio/mp3 (it's still not an mp3), it suddenly works (almost) perfectly???
@55nknown are you using a feature that enables the proxy, such as HTTP headers, `LockCachingAudioSource` or `StreamAudioSource`?
I am using `StreamAudioSource`.
I'm running into the same issue: `.m4a` plays fine before running it through `ffmpeg`, and fails after.
In case another example is at all helpful, here's the command (for debugging purposes I've trimmed it down to just decode and re-encode):

    -i "var/mobile/.../recording_2022_10_02_24527.m4a" var/mobile/.../recording_2022_10_02_24527_denoised.m4a
Here are the files: testing clips.zip
Let me know if there is anything else I can do to help! In the meantime I'll use a streaming audio source instead of `setFile` and manually specify the content type as others have done above.
@caseycrogers if you're using `setFile`, then you have a different issue, because this issue is about a problem that occurs when using `StreamAudioSource`. When using `setFile`, you are depending on iOS's method of using the file extension to determine the file type. just_audio doesn't have a say in what iOS does there, so you would need to read the iOS documentation to see what filename extensions it recognises for what types.
Both clips are `.m4a`, so the issue must not be dependent on the file extension?

All the same, I'm having trouble finding relevant iOS documentation (Apple is very bad at this). I'm digging through just_audio's source code for a spot where I can add a breakpoint to check the type just_audio is receiving, and I'm not finding one. Could you point me to a spot to check this and I'll come back with the values?
Edit: I also piled onto this issue instead of creating a new one because my problem sounds very similar to mt633's: can't play an `.m4a` exported from `ffmpeg`. However, if you think this is a different issue and it makes things easier, I'm happy to file a fresh issue for this.
As explained above, when dealing with `setFile`, you are experiencing the direct behaviour of iOS itself. There's nothing I can do about that, so you will just need to consult the iOS documentation. Just because two files have the same extension does not mean they will both be handled equally well by iOS; the two files may be encoded differently, and iOS may have different support for different encodings. Consult iOS's documentation on that matter.
I have the same problem with `StreamAudioSource` not working on iOS. I have tried various packages and they all have similar issues. However, I found https://pub.dev/packages/audiofileplayer, which has a loadFromByteData function that works on both Android and iOS. This might give you a clue to solve the problem.
There are no plans to use that type of solution, since transmitting large byte arrays over method channels can potentially lock up the UI. However, the particular use case of playing audio from a byte array has other solutions besides `StreamAudioSource`, such as https://github.com/ryanheise/just_audio/issues/685#issuecomment-1069188124
I had the same problem when storing the audio file in a database as a byte array. I was using ASP.NET Core to return the byte array for a specific file ID. Here are my attempts in ASP.NET Core...

Endpoint: /files/1

Server-side code:

    var file = ...; // get from database
    return File(file.Content);

Status: NOT WORKING

I read that iOS needs a file extension inside the URL, as well as a content type, so I added both.

Endpoint: /files/1.wav

Server-side code:

    var file = ...; // get from database
    return File(file.Content, "audio/wav");

Status: NOT WORKING

After reading all these comments I tried to do something with byte ranges. There is a third parameter in the File constructor, enableRangeProcessing. Let's try setting it to true.

Endpoint: /files/1.wav

Server-side code:

    var file = ...; // get from database
    return File(file.Content, "audio/wav", true); // true enables range processing

Status: WORKING!!!

So, I can confirm that the extension, content type, content length and byte-range support all need to be present in order for it to work.
Which API doesn't behave as documented, and how does it misbehave?

When I use byte array audio, Android and Web work correctly, but with iOS I have this error:

    PlayerException ((-11850) Operation Stopped)

P.S. I don't know why Android needs `android:usesCleartextTraffic="true"` to work.

Minimal reproduction project
https://github.com/andrea689/just_audio_ios_error
main.dart
```dart
import 'dart:convert';
import 'dart:typed_data';

import 'package:collection/collection.dart';
import 'package:flutter/material.dart';
import 'package:http/http.dart' as http;
import 'package:just_audio/just_audio.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return const MaterialApp(
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatelessWidget {
  const MyHomePage({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Sound Test'),
      ),
      body: Center(
        child: FutureBuilder
```

To Reproduce (i.e. user steps, not code)

Steps to reproduce the behavior:
Error messages
Expected behavior Correct audio playback
Smartphone (please complete the following information):
Flutter SDK version