ryanheise / just_audio

Audio Player

Improve StreamAudioSource Documentation #531

Open dhushyanth-s opened 2 years ago

dhushyanth-s commented 2 years ago

To which pages does your suggestion apply?

I wasn't able to find any documentation for StreamAudioSource except the pub.dev documentation page.

Quote the sentence(s) from the documentation to be improved (if any)

N/A

Describe your suggestion

By reading the existing documentation, I was able to put together a simple implementation:

import 'package:flutter/services.dart';
import 'package:just_audio/just_audio.dart';

class StreamSource extends StreamAudioSource {
  final String uri;
  final String fileType;

  // The Android content URI and the corresponding file type are obtained
  // using the MediaStore API on the Android side.
  StreamSource(this.uri, this.fileType);

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    // Use a method channel to read the file into a list of bytes.
    const channel = MethodChannel("package/Main");
    final file = await channel.invokeMethod("getContentsFromUri", {"uri": uri});
    final fileBytes = file as List<int>;

    final rangeStart = start ?? 0;
    final rangeEnd = end ?? fileBytes.length;

    // Return the requested byte range along with metadata about the source.
    return StreamAudioResponse(
      sourceLength: fileBytes.length,
      contentLength: rangeEnd - rangeStart,
      offset: rangeStart,
      stream: Stream.value(fileBytes.sublist(rangeStart, rangeEnd)),
      // mimeTypes maps the file type to a MIME type (defined elsewhere).
      contentType: mimeTypes[fileType]!,
    );
  }
}

The above implementation works well, but I would like some documentation on how to implement things properly, especially when the data arrives as a stream of bytes (Stream<Uint8List> or Stream<int>). If such documentation exists and I just didn't see it, please provide the link and close the issue.

ryanheise commented 2 years ago

I don't intend to put an example in the documentation page in the short term as I think examples are best placed in the examples directory. However, I can improve the documentation if it is confusing. Which part is unclear? Can you quote the part?

dhushyanth-s commented 2 years ago

Many other popular packages include one or two code snippets in the documentation to ease the developer's custom implementation, especially for abstract classes. The examples directory is a good place for examples, but it requires developers to read the surrounding code and context as well in order to understand the implementation.

Anyway, the parts where I got confused were:

  1. Why are contentLength and offset needed if start and end are already known from the calling context? Isn't contentLength simply end - start (or the difference of the corresponding limits)?
  2. Why is the stream typed Stream<List<int>> instead of Stream<int>?
  3. Is there any other class I can use that will just play from a byte array (List<int>), i.e. a kind of default implementation of StreamAudioSource? I figured this out by going through the documentation, so it's not a priority.

ryanheise commented 2 years ago

Thanks, I'll try to incorporate some of that in the docs in the future. If you're not already familiar with byte range requests and chunked I/O, Google may be your friend in the meantime until I can give examples.

Your third point is probably not what you want once I/O chunking is taken into consideration, but I suppose if the file is small enough you could try putting the whole thing into one chunk.
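
In the meantime, here's a rough sketch of how the first two points fit together. This is just an illustration, not an official example; it assumes the encoded bytes are already in memory, and the 64 KiB chunk size is arbitrary:

import 'dart:math' as math;
import 'dart:typed_data';

import 'package:just_audio/just_audio.dart';

// Illustration only: serves a byte range from an in-memory, already-encoded file.
class InMemoryAudioSource extends StreamAudioSource {
  final Uint8List bytes; // the whole encoded file (e.g. an MP3)
  final String mimeType; // assumed known by the caller

  InMemoryAudioSource(this.bytes, this.mimeType);

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    final rangeStart = start ?? 0;
    final rangeEnd = end ?? bytes.length;
    const chunkSize = 64 * 1024; // arbitrary chunk size for illustration
    return StreamAudioResponse(
      sourceLength: bytes.length,           // total size of the underlying file
      contentLength: rangeEnd - rangeStart, // size of just the requested range
      offset: rangeStart,                   // where the range starts in the file
      contentType: mimeType,
      // Stream<List<int>> lets the data arrive in chunks rather than as one
      // giant list; each event below is one slice of the requested range.
      stream: Stream.fromIterable([
        for (var i = rangeStart; i < rangeEnd; i += chunkSize)
          bytes.sublist(i, math.min(i + chunkSize, rangeEnd)),
      ]),
    );
  }
}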

jemisgoti commented 2 years ago

I am receiving a PCM16 stream from a mic source. I want to play the PCM16 stream live using StreamAudioSource. How do I do this?

ryanheise commented 2 years ago

@jemisgoti with StreamAudioSource you basically just need to write some code to output a stream of encoded bytes, in one of the platform's supported audio encodings, not raw PCM audio.

just_audio doesn't provide any encoders, so you will need to search for some other package to handle the encoding part since your audio isn't already encoded. Once you have an encoder, you can implement your subclass of StreamAudioSource to output a stream of encoded bytes.
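
To sketch the shape of that idea (the names EncodedStreamSource, encodedStream and totalLength are hypothetical, and this assumes some encoder package has already produced the encoded bytes and that the total encoded length is known up front):

import 'package:just_audio/just_audio.dart';

// Illustration only: forwards an already-encoded byte stream to the player.
class EncodedStreamSource extends StreamAudioSource {
  final Stream<List<int>> encodedStream; // produced by some encoder package
  final int totalLength;                 // total encoded size in bytes
  final String mimeType;                 // e.g. 'audio/wav'

  EncodedStreamSource(this.encodedStream, this.totalLength, this.mimeType);

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    // For this sketch, range requests are ignored and the whole stream is
    // served from offset 0.
    return StreamAudioResponse(
      sourceLength: totalLength,
      contentLength: totalLength,
      offset: 0,
      stream: encodedStream,
      contentType: mimeType,
    );
  }
}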

jemisgoti commented 2 years ago

Basically I just want functionality similar to https://flutter-sound.canardoux.xyz/tau_api_player_start_player_from_stream.html using StreamAudioSource. Is that possible?

ryanheise commented 2 years ago

As I explained in my previous answer, raw PCM audio is not supported by this plugin; you need to encode it first. If you can encode it first, then you can do what you're trying to do.

So, you'll need to find another plugin to encode your raw PCM audio into one of the standard audio encodings that the platform will understand, such as WAV. Then you can stream the WAV data from just_audio's StreamAudioSource.

I have not researched the other plugins to see if there is actually an existing plugin that can do encoding. Maybe the ffmpeg plugin can do it, although its licence might not be suitable for an app. There are also various microphone recorder plugins that contain code for encoding raw PCM data into WAV, but someone would need to extract that code into a separate plugin. Maybe there already is a plugin that does that, but you'll need to research it since just_audio is only concerned with the decoding side of things.
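
For reference, the WAV side of this is mostly just a 44-byte RIFF header placed in front of the PCM samples. Here's a rough sketch of building that header (assuming 16-bit little-endian PCM and that the total PCM length is known in advance, which it wouldn't be for a genuinely live stream):

import 'dart:typed_data';

// Rough sketch: builds a 44-byte WAV (RIFF) header for 16-bit PCM data.
// sampleRate and channels are whatever your microphone source produced.
Uint8List wavHeader({
  required int pcmLength,  // total PCM payload size in bytes
  required int sampleRate, // e.g. 44100
  required int channels,   // e.g. 1 for mono
}) {
  const bitsPerSample = 16;
  final byteRate = sampleRate * channels * bitsPerSample ~/ 8;
  final blockAlign = channels * bitsPerSample ~/ 8;
  final header = ByteData(44);
  void ascii(int offset, String s) {
    for (var i = 0; i < s.length; i++) {
      header.setUint8(offset + i, s.codeUnitAt(i));
    }
  }
  ascii(0, 'RIFF');
  header.setUint32(4, 36 + pcmLength, Endian.little); // file size minus 8
  ascii(8, 'WAVE');
  ascii(12, 'fmt ');
  header.setUint32(16, 16, Endian.little);            // fmt chunk size
  header.setUint16(20, 1, Endian.little);             // audio format 1 = PCM
  header.setUint16(22, channels, Endian.little);
  header.setUint32(24, sampleRate, Endian.little);
  header.setUint32(28, byteRate, Endian.little);
  header.setUint16(32, blockAlign, Endian.little);
  header.setUint16(34, bitsPerSample, Endian.little);
  ascii(36, 'data');
  header.setUint32(40, pcmLength, Endian.little);     // PCM payload size
  return header.buffer.asUint8List();
}

Prepending that header to the PCM bytes gives data that a StreamAudioSource subclass could then serve, although a stream whose length keeps growing has extra caveats of its own.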

artyomkonyaev commented 9 months ago

Hello @jemisgoti ,

I am currently facing a problem that is very similar to what you described here. Have you been able to solve this problem?

I would be grateful if you could share some details about your current approach, possibly with code snippets, even if you are using something other than just_audio (perhaps such knowledge could later help improve this package's capabilities).

Thank you.

(Edited by ryanheise: This has been marked off topic because you are cross posting the same question across multiple unrelated issues. There should be a single issue per... issue.)