Canardoux / flutter_sound

Flutter plugin for sound. Audio recorder and player.
Mozilla Public License 2.0

[enh] PCM stream should convert to List<int> #596

Open GGLabCenter opened 3 years ago

GGLabCenter commented 3 years ago

Hi,

I am trying to stream in near-realtime from the recorder (mic) to the player (headsets) - like many people, it seems :-). I started from the streamLoop example: easy and cool.

My next step: to add the processing in the middle. I take the Uint8List from buffer.data, convert it to an Int16List, do the processing on it, and convert back. I tried to use Codec.pcm16WAV, because with Codec.pcm16 the processing step itself works fine, but in the headsets I get only a terrible white noise mixed with silence. I really have no idea about the reasons.

I know the processing step itself is OK, and I think it should work fine with pcm16WAV. I don't know why I get this very strange behavior with Codec.pcm16. I did a test with a classic WAV file recorded on the phone (but not with this library), and processing + playing worked without issues. But I need to do the same in real time with the stream.

I attach a MediaInfo report of the file that worked (played correctly from assets, with the processing in the middle):


Format                                   : Wave
File size                                : 175 KiB
Duration                                 : 4 s 63 ms
Overall bit rate mode                    : Constant
Overall bit rate                         : 353 kb/s
Format                                   : PCM
Format settings                          : Little / Signed
Codec ID                                 : 1
Duration                                 : 4 s 63 ms
Bit rate mode                            : Constant
Bit rate                                 : 352.8 kb/s
Channel(s)                               : 1 channel
Sampling rate                            : 22.05 kHz
Bit depth                                : 16 bits
Stream size                              : 175 KiB (100%)

I opened this file with basic code plus minor mods of liveplayback.dart, with Codec.pcm16 (!) set on the player. This last result confused me.

After a bunch of shady errors I found that the record-to-stream feature is available only for pcm16 (nice to have: a brief note about that in the docs in that section; I didn't find any!). I don't know why the app did not throw the exception in flutter_sound_recorder.dart (245):

    if (toStream != null && codec != Codec.pcm16)
      throw Exception ('toStream can only be used with codec == Codec.pcm16');

The error I got instead (partial):

    E/flutter (32427): [ERROR:flutter/lib/ui/ui_dart_state.cc(177)] Unhandled Exception: PlatformException(ERR_UNKNOWN, ERR_UNKNOWN, startPlayer() error, null)
    E/flutter (32427): #0 StandardMethodCodec.decodeEnvelope package:flutter/…/services/message_codecs.dart:582
    E/flutter (32427): #1 MethodChannel._invokeMethod package:flutter/…/services/platform_channel.dart:159
    E/flutter (.....

Why is there such a limit on the codec when using record-to-stream? Do you think there's any way to fix/improve that on the library's side? Do you have any suggestion for this use case?

Thanks

Larpoux commented 3 years ago

Do you use Android or iOS ?

Record to Stream and Playback from Stream cannot work with the Wave format. Wave is a FILE format with a header; it cannot be used with a STREAM. The header cannot be written until the end of the recording, and the file is considered *corrupted* until this header is updated.
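To make the constraint concrete, here is a sketch (not Flutter Sound code; the field layout follows the canonical 44-byte PCM WAV header) showing the two size fields whose values cannot be known while recording is still in progress:

```dart
import 'dart:typed_data';

/// Build a canonical 44-byte PCM WAV header. The two fields marked
/// "depends on total length" are why a valid header cannot be written
/// until the recording has finished.
Uint8List wavHeader(int dataBytes, int sampleRate, int channels) {
  final h = ByteData(44);
  void tag(int off, String s) {
    for (var i = 0; i < 4; i++) {
      h.setUint8(off + i, s.codeUnitAt(i));
    }
  }

  tag(0, 'RIFF');
  h.setUint32(4, 36 + dataBytes, Endian.little); // depends on total length
  tag(8, 'WAVE');
  tag(12, 'fmt ');
  h.setUint32(16, 16, Endian.little); // fmt chunk size
  h.setUint16(20, 1, Endian.little); // audio format: PCM
  h.setUint16(22, channels, Endian.little);
  h.setUint32(24, sampleRate, Endian.little);
  h.setUint32(28, sampleRate * channels * 2, Endian.little); // byte rate
  h.setUint16(32, channels * 2, Endian.little); // block align
  h.setUint16(34, 16, Endian.little); // bits per sample
  tag(36, 'data');
  h.setUint32(40, dataBytes, Endian.little); // depends on total length
  return h.buffer.asUint8List();
}
```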

You must isolate your problem :

Be careful with your processing, particularly the endianness. I seem to remember that τ currently supports only "Little Endian" (but my memory is not very good and I may be wrong). The data are INT16, signed, and only one channel (mono). Flutter Sound will accept other formats later. I hope.
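A minimal sketch of what that implies in Dart (an illustrative helper, not part of the Flutter Sound API), assuming little-endian, signed 16-bit, mono PCM:

```dart
import 'dart:typed_data';

/// Interpret a little-endian PCM16 byte buffer as signed 16-bit samples.
Int16List bytesToSamples(Uint8List bytes) {
  final data = ByteData.sublistView(bytes);
  final samples = Int16List(bytes.length ~/ 2);
  for (var i = 0; i < samples.length; i++) {
    samples[i] = data.getInt16(i * 2, Endian.little);
  }
  return samples;
}
```

On a little-endian host, `bytes.buffer.asInt16List()` produces the same result without copying, because typed-data views use the host byte order; the explicit ByteData loop only matters when the host and stream byte orders differ.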

GGLabCenter commented 3 years ago

I am actually working on Android, but I would like to stay open to an iOS implementation eventually - that's Flutter's main claim, after all!

Thanks for the endianness tip: this was already something I wanted to check, because my suspicion is that some conversion is missing there.

The problem does not seem to be in the processing step: I processed a simple WAV file (recorded with another app; MediaInfo report above) and played it in the player, and it works as expected after the processing. I also used the processing code outside Flutter and it works fine.

I'll do some tests for the little/big endian conversion, thanks!

GGLabCenter commented 3 years ago

Do you maybe have a code sample for the endianness conversion (big to little and vice versa) for lists? It is not clear to me how to do the conversion with my lists.

In the meantime, I paste here a sample that lets you reproduce the issue easily (I used a class from the example!). As you can see in the comments inside the code, I commented out the processing row and noticed that the problem remains:


/*
 * Copyright 2018, 2019, 2020 Dooboolab.
 *
 * This file is part of Flutter-Sound.
 *
 * Flutter-Sound is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Lesser General Public License version 3 (LGPL-V3), as published by
 * the Free Software Foundation.
 *
 * Flutter-Sound is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public License
 * along with Flutter-Sound.  If not, see <https://www.gnu.org/licenses/>.
 */

import 'dart:async';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:flutter_sound/flutter_sound.dart';
import 'package:path_provider/path_provider.dart';
import 'package:permission_handler/permission_handler.dart';
/*
 * This is an example showing how to record to a Dart Stream.
 * It writes all the recorded data from a Stream to a File, which is completely stupid:
 * if an App wants to record something to a File, it must not use Streams.
 *
 * The real interest of recording to a Stream is for example to feed a
 * Speech-to-Text engine, or for processing the Live data in Dart in real time.
 *
 */

///
const int tSampleRate = 44100;
typedef _Fn = void Function();

/// Example app.
class RecordToStreamExample extends StatefulWidget {
  @override
  _RecordToStreamExampleState createState() => _RecordToStreamExampleState();
}

class _RecordToStreamExampleState extends State<RecordToStreamExample> {
  FlutterSoundPlayer _mPlayer = FlutterSoundPlayer();
  FlutterSoundRecorder _mRecorder = FlutterSoundRecorder();
  bool _mPlayerIsInited = false;
  bool _mRecorderIsInited = false;
  bool _mplaybackReady = false;
  String _mPath;
  StreamSubscription _mRecordingDataSubscription;

  Future<void> _openRecorder() async {
    var status = await Permission.microphone.request();
    if (status != PermissionStatus.granted) {
      throw RecordingPermissionException('Microphone permission not granted');
    }
    await _mRecorder.openAudioSession();
    setState(() {
      _mRecorderIsInited = true;
    });
  }

  @override
  void initState() {
    super.initState();
    // Be careful : openAudioSession return a Future.
    // Do not access your FlutterSoundPlayer or FlutterSoundRecorder before the completion of the Future
    _mPlayer.openAudioSession().then((value) {
      setState(() {
        _mPlayerIsInited = true;
      });
    });
    _openRecorder();
  }

  @override
  void dispose() {
    stopPlayer();
    _mPlayer.closeAudioSession();
    _mPlayer = null;

    stopRecorder();
    _mRecorder.closeAudioSession();
    _mRecorder = null;
    super.dispose();
  }

  Future<IOSink> createFile() async {
    var tempDir = await getExternalStorageDirectory();
    _mPath = '${tempDir.path}/flutter_sound_example3.pcm';
    var outputFile = File(_mPath);
    if (outputFile.existsSync()) {
      await outputFile.delete();
    }
    return outputFile.openWrite();
  }

  // ----------------------  Here is the code to record to a Stream ------------

  Future<void> record() async {
    play();

    assert(_mRecorderIsInited && _mPlayer.isStopped);
    //var sink = await createFile();
    var recordingDataController = StreamController<Food>();
    _mRecordingDataSubscription =
        recordingDataController.stream.listen((buffer) {
      if (buffer is FoodData) {
        // sink.add(buffer.data); no interest to save into a file + then stream
        // would be already occupied by a listener (and so it gives error).

        /*------------------ my tests: ----------------------*/

        /* (1) works fine: feeding the player directly with the input without
           any change, as in the sample, gives correct audio in the headsets: */
        // feedHim(buffer.data);

        /* (2) doesn't work: building a new Uint8List from the extracted
           array gives noise + silence: */
        var input_array = buffer.data.buffer.asInt16List();
        // [+] eventual processing goes here using the Int16List input_array
        //      and gives back an Int16List output_array. For now, in order
        //      to test, I just give back the input_array.
        /* two options: copy the array or use it directly (but same result): */
        // FoodData b = new FoodData(input_array.buffer.asUint8List());
        FoodData b = new FoodData(Uint8List.fromList([...input_array]));
        feedHim(b.data);

        /*------------------ --------- ----------------------*/

        // buffer.data = input_array.buffer.asUint8List();
        // feedHim(buffer.data);

        //_mPlayer.foodStreamController.add(b); //buffer); //
      }
    });
    await _mRecorder.startRecorder(
      toStream: recordingDataController.sink,
      codec: Codec.pcm16,
      numChannels: 1,
      sampleRate: tSampleRate,
    );
    setState(() {});
  }
  // --------------------- (it was very simple, wasn't it ?) -------------------

  var blockSize = 4096;
  Future<void> feedHim(Uint8List buffer) async {
    var lnData = 0;
    var totalLength = buffer.length;
    while (totalLength > 0) {
      var bsize = totalLength > blockSize ? blockSize : totalLength;
      await _mPlayer
          .feedFromStream(buffer.sublist(lnData, lnData + bsize)); // await !!!!
      lnData += bsize;
      totalLength -= bsize;
    }
  }

  Future<void> stopRecorder() async {
    await _mRecorder.stopRecorder();
    if (_mRecordingDataSubscription != null) {
      await _mRecordingDataSubscription.cancel();
      _mRecordingDataSubscription = null;
    }
    _mplaybackReady = true;
  }

  _Fn getRecorderFn() {
    if (!_mRecorderIsInited || !_mPlayer.isStopped) {
      return null;
    }
    return _mRecorder.isStopped
        ? record
        : () {
            stopRecorder().then((value) => setState(() {}));
          };
  }

  void play() async {
    /*assert(_mPlayerIsInited &&
        _mplaybackReady &&
        _mRecorder.isStopped &&
        _mPlayer.isStopped);*/
    /*await _mPlayer.startPlayer(
        fromURI: _mPath,
        sampleRate: tSampleRate,
        codec: Codec.pcm16,
        numChannels: 1,*/
    await _mPlayer.startPlayerFromStream(
        codec: Codec.pcm16, numChannels: 1, sampleRate: tSampleRate);
    //whenFinished: () {
    setState(() {});
    //  }); // The readability of Dart is very special :-(
    // setState(() {});
  }

  Future<void> stopPlayer() async {
    await _mPlayer.stopPlayer();
  }

  _Fn getPlaybackFn() {
    if (!_mPlayerIsInited || !_mplaybackReady || !_mRecorder.isStopped) {
      return null;
    }
    return _mPlayer.isStopped
        ? play
        : () {
            stopPlayer().then((value) => setState(() {}));
          };
  }

  // ----------------------------------------------------------------------------------------------------------------------

  @override
  Widget build(BuildContext context) {
    Widget makeBody() {
      return Column(
        children: [
          Container(
            margin: const EdgeInsets.all(3),
            padding: const EdgeInsets.all(3),
            height: 80,
            width: double.infinity,
            alignment: Alignment.center,
            decoration: BoxDecoration(
              color: Color(0xFFFAF0E6),
              border: Border.all(
                color: Colors.indigo,
                width: 3,
              ),
            ),
            child: Row(children: [
              RaisedButton(
                onPressed: getRecorderFn(),
                color: Colors.white,
                disabledColor: Colors.grey,
                child: Text(_mRecorder.isRecording ? 'Stop' : 'Record'),
              ),
              SizedBox(
                width: 20,
              ),
              Text(_mRecorder.isRecording
                  ? 'Recording in progress'
                  : 'Recorder is stopped'),
            ]),
          ),
          Container(
            margin: const EdgeInsets.all(3),
            padding: const EdgeInsets.all(3),
            height: 80,
            width: double.infinity,
            alignment: Alignment.center,
            decoration: BoxDecoration(
              color: Color(0xFFFAF0E6),
              border: Border.all(
                color: Colors.indigo,
                width: 3,
              ),
            ),
            child: Row(children: [
              RaisedButton(
                onPressed: getPlaybackFn(),
                color: Colors.white,
                disabledColor: Colors.grey,
                child: Text(_mPlayer.isPlaying ? 'Stop' : 'Play'),
              ),
              SizedBox(
                width: 20,
              ),
              Text(_mPlayer.isPlaying
                  ? 'Playback in progress'
                  : 'Player is stopped'),
            ]),
          ),
        ],
      );
    }

    return Scaffold(
      backgroundColor: Colors.blue,
      appBar: AppBar(
        title: const Text('Record to Stream ex.'),
      ),
      body: makeBody(),
    );
  }
}

It may not be the best code for achieving that functionality, but my current priority is a working draft/prototype.

Larpoux commented 3 years ago

I think that :

input_array = buffer.data.buffer.asInt16List();

is not correct. You should try something like :

input_array = List<int>.filled(buffer.data.length ~/ 2, 0);
for (int i = 0; i < buffer.data.length ~/ 2; ++i)
{
         input_array[i] = ((buffer.data[2*i] << 8) + buffer.data[2*i+1]).toSigned(16);
         // Or maybe : input_array[i] = ((buffer.data[2*i+1] << 8) + buffer.data[2*i]).toSigned(16); // I am not sure
}

This code is just an idea. You may still have to adjust it.

And after processing, do the opposite to create the Uint8List that you give to the output sink.
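That opposite step can be sketched the same way (again an illustrative helper, not library code), packing signed 16-bit samples back into little-endian bytes:

```dart
import 'dart:typed_data';

/// Pack signed 16-bit samples into a little-endian PCM16 byte buffer.
Uint8List samplesToBytes(List<int> samples) {
  final data = ByteData(samples.length * 2);
  for (var i = 0; i < samples.length; i++) {
    data.setInt16(i * 2, samples[i], Endian.little);
  }
  return data.buffer.asUint8List();
}
```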

I agree: this code is ugly and should be done inside τ, not in your App. I am going to put a new task on the Kanban board (roadmap).

GGLabCenter commented 3 years ago

OK, I am quite sure the code below has problems and is not correct, but it's what I was able to do until now: conversion to Int16List and back to Uint8List. I see that FoodData.data is an ArrayView object somehow related to the buffer's Uint8List. I think I have to wait for the 'official' improvement from this issue. Anyway, I share the code here in case it is useful:

/*
 * Copyright 2018, 2019, 2020 Dooboolab.
 *
 * This file is part of Flutter-Sound.
 *
 * Flutter-Sound is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Lesser General Public License version 3 (LGPL-V3), as published by
 * the Free Software Foundation.
 *
 * Flutter-Sound is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public License
 * along with Flutter-Sound.  If not, see <https://www.gnu.org/licenses/>.
 */

import 'dart:async';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:flutter_sound/flutter_sound.dart';
import 'package:path_provider/path_provider.dart';
import 'package:permission_handler/permission_handler.dart';

/*
 * This is an example showing how to record to a Dart Stream.
 * It writes all the recorded data from a Stream to a File, which is completely stupid:
 * if an App wants to record something to a File, it must not use Streams.
 *
 * The real interest of recording to a Stream is for example to feed a
 * Speech-to-Text engine, or for processing the Live data in Dart in real time.
 *
 */

const int tSampleRate = 44100;
typedef _Fn = void Function();

/// Example app.
class RecordToStreamExample extends StatefulWidget {
  @override
  _RecordToStreamExampleState createState() => _RecordToStreamExampleState();
}

class _RecordToStreamExampleState extends State<RecordToStreamExample> {
  FlutterSoundPlayer _mPlayer = FlutterSoundPlayer();
  FlutterSoundRecorder _mRecorder = FlutterSoundRecorder();
  bool _mPlayerIsInited = false;
  bool _mRecorderIsInited = false;
  bool _mplaybackReady = false;
  String _mPath;
  StreamSubscription _mRecordingDataSubscription;

  Future<void> _openRecorder() async {
    var status = await Permission.microphone.request();
    if (status != PermissionStatus.granted) {
      throw RecordingPermissionException('Microphone permission not granted');
    }
    await _mRecorder.openAudioSession();
    setState(() {
      _mRecorderIsInited = true;
    });
  }

  @override
  void initState() {
    super.initState();
    // Be careful : openAudioSession return a Future.
    // Do not access your FlutterSoundPlayer or FlutterSoundRecorder before the completion of the Future
    _mPlayer.openAudioSession().then((value) {
      setState(() {
        _mPlayerIsInited = true;
      });
    });
    _openRecorder();
  }

  @override
  void dispose() {
    stopPlayer();
    _mPlayer.closeAudioSession();
    _mPlayer = null;

    stopRecorder();
    _mRecorder.closeAudioSession();
    _mRecorder = null;
    super.dispose();
  }

/*
  Future<IOSink> createFile() async {
    var tempDir = await getExternalStorageDirectory();
    _mPath = '${tempDir.path}/flutter_sound_example.pcm';
    var outputFile = File(_mPath);
    if (outputFile.existsSync()) {
      await outputFile.delete();
    }
    return outputFile.openWrite();
  }
*/
  // ----------------------  Here is the code to record to a Stream ------------

  List<int> input_array;
  Uint8List output_array;
  Future<void> record() async {
    play();

    assert(_mRecorderIsInited && _mPlayer.isStopped);
    //var sink = await createFile();
    var recordingDataController = StreamController<Food>();
    _mRecordingDataSubscription =
        recordingDataController.stream.listen((buffer) {
      if (buffer is FoodData) {
        //sink.add(buffer.data);
        const endianness = Endian.little;
        input_array = List<int>.filled(buffer.data.length ~/ 2, 0);
        var temp = buffer.data.buffer.asByteData();
        for (int i = 0; i < buffer.data.length ~/ 2; ++i) {
          input_array[i] = temp.getInt16(i * 2, endianness);
        }

        // Pack the (possibly processed) samples back into bytes.
        Uint8List uint8list = Uint8List(input_array.length * 2);
        var out = uint8list.buffer.asByteData();
        for (int i = 0; i < input_array.length; ++i) {
          out.setInt16(i * 2, input_array[i], endianness);
        }

        buffer.data.buffer.asUint8List().setAll(0, uint8list);

        feedHim(buffer.data);
      }
    });
    await _mRecorder.startRecorder(
      toStream: recordingDataController.sink,
      codec: Codec.pcm16,
      numChannels: 1,
      sampleRate: tSampleRate,
    );
    setState(() {});
  }
  // --------------------- (it was very simple, wasn't it ?) -------------------

  var blockSize = 4096;
  Future<void> feedHim(Uint8List buffer) async {
    var lnData = 0;
    var totalLength = buffer.length;
    while (totalLength > 0) {
      var bsize = totalLength > blockSize ? blockSize : totalLength;
      /*var aa = buffer.sublist(lnData, lnData + bsize);
      var bb = Uint8List(0).toList();
      for (int x = 0; x < aa.lengthInBytes - 1; x + 2) {
        bb.add(aa.buffer.asByteData().getInt16(x, Endian.big));
      }*/
      await _mPlayer
          .feedFromStream(buffer.sublist(lnData, lnData + bsize)); // await !!!!
      lnData += bsize;
      totalLength -= bsize;
    }
  }

  Future<void> stopRecorder() async {
    await _mRecorder.stopRecorder();
    if (_mRecordingDataSubscription != null) {
      await _mRecordingDataSubscription.cancel();
      _mRecordingDataSubscription = null;
    }
    _mplaybackReady = true;
  }

  _Fn getRecorderFn() {
    if (!_mRecorderIsInited || !_mPlayer.isStopped) {
      return null;
    }
    return _mRecorder.isStopped
        ? record
        : () {
            stopRecorder().then((value) => setState(() {}));
          };
  }

  void play() async {
    /*assert(_mPlayerIsInited &&
        _mplaybackReady &&
        _mRecorder.isStopped &&
        _mPlayer.isStopped);*/
    /*await _mPlayer.startPlayer(
        fromURI: _mPath,
        sampleRate: tSampleRate,
        codec: Codec.pcm16,
        numChannels: 1,*/
    await _mPlayer.startPlayerFromStream(
        codec: Codec.pcm16, numChannels: 1, sampleRate: tSampleRate);
    //whenFinished: () {
    setState(() {});
    //  }); // The readability of Dart is very special :-(
    // setState(() {});
  }

  Future<void> stopPlayer() async {
    await _mPlayer.stopPlayer();
  }

  _Fn getPlaybackFn() {
    if (!_mPlayerIsInited || !_mplaybackReady || !_mRecorder.isStopped) {
      return null;
    }
    return _mPlayer.isStopped
        ? play
        : () {
            stopPlayer().then((value) => setState(() {}));
          };
  }

  // ----------------------------------------------------------------------------------------------------------------------

  @override
  Widget build(BuildContext context) {
    Widget makeBody() {
      return Column(
        children: [
          Container(
            margin: const EdgeInsets.all(3),
            padding: const EdgeInsets.all(3),
            height: 80,
            width: double.infinity,
            alignment: Alignment.center,
            decoration: BoxDecoration(
              color: Color(0xFFFAF0E6),
              border: Border.all(
                color: Colors.indigo,
                width: 3,
              ),
            ),
            child: Row(children: [
              RaisedButton(
                onPressed: getRecorderFn(),
                color: Colors.white,
                disabledColor: Colors.grey,
                child: Text(_mRecorder.isRecording ? 'Stop' : 'Record'),
              ),
              SizedBox(
                width: 20,
              ),
              Text(_mRecorder.isRecording
                  ? 'Recording in progress'
                  : 'Recorder is stopped'),
            ]),
          ),
          Container(
            margin: const EdgeInsets.all(3),
            padding: const EdgeInsets.all(3),
            height: 80,
            width: double.infinity,
            alignment: Alignment.center,
            decoration: BoxDecoration(
              color: Color(0xFFFAF0E6),
              border: Border.all(
                color: Colors.indigo,
                width: 3,
              ),
            ),
            child: Row(children: [
              RaisedButton(
                onPressed: getPlaybackFn(),
                color: Colors.white,
                disabledColor: Colors.grey,
                child: Text(_mPlayer.isPlaying ? 'Stop' : 'Play'),
              ),
              SizedBox(
                width: 20,
              ),
              Text(_mPlayer.isPlaying
                  ? 'Playback in progress'
                  : 'Player is stopped'),
            ]),
          ),
        ],
      );
    }

    return Scaffold(
      backgroundColor: Colors.blue,
      appBar: AppBar(
        title: const Text('Record to Stream ex.'),
      ),
      body: makeBody(),
    );
  }
}

ebelevics commented 3 years ago

I'm also trying to understand why the stream delivers a Uint8List instead of a Uint16List, since we are using pcm16 with 16-bit depth. My first thought was that each 16-bit value is split into two Uint8 values, but simply combining pairs of Uint8 values did not work: the recorded audio played back twice as fast.
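For reference, the two Uint8 values are indeed the two halves of one little-endian Int16 sample. One plausible cause of the doubled speed (an assumption, since the exact code isn't shown) is that the combined list, with half as many elements, was fed back as if each element were still one byte. On a little-endian host (current Android/iOS targets) the pairing can instead be done as a zero-copy view; this sketch is illustrative, not part of the Flutter Sound API:

```dart
import 'dart:typed_data';

/// Reinterpret a PCM16 byte buffer as 16-bit samples without copying.
/// Assumes a little-endian host and an even byte offset into the buffer.
Int16List viewAsInt16(Uint8List bytes) {
  return bytes.buffer.asInt16List(bytes.offsetInBytes, bytes.length ~/ 2);
}
```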

invisible-defects commented 3 years ago

@ebelevics any updates on that? still scratching my head

Larpoux commented 3 years ago

Yes, I agree with you: startRecorder should return:

I am currently working on an upgrade of the Flutter Sound API. It will be Flutter Sound Version 9.0.x

With this new version, I want to implement various PCM formats (mono/stereo, 8/16-bit width, little/big endian, integer/float). And the App will receive a correct List<something>.

I cannot say when this new API will be released. Probably I will release a beta version during September.

Larpoux commented 3 years ago

There are also some developers who have fought with Base64 encoding. If you have any suggestions about this need, please tell me. Perhaps just a utility inside Flutter Sound to convert to/from Base64.
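For the raw conversion, dart:convert already provides base64Encode and base64Decode, so such a utility could be a thin wrapper (the helper names here are hypothetical):

```dart
import 'dart:convert';
import 'dart:typed_data';

/// Hypothetical helpers for apps that exchange PCM buffers as Base64 text.
String pcmToBase64(Uint8List pcm) => base64Encode(pcm);

Uint8List base64ToPcm(String encoded) => base64Decode(encoded);
```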

Larpoux commented 3 years ago

Another thing that I would like to implement: RecordToStream and PlaybackFromStream on Flutter Web. I am pretty sure that it is possible to do that in JavaScript.

But unfortunately I think that it will not be possible to do that in 9.0, because it would add further delay to the release of 9.0.

invisible-defects commented 3 years ago

Thanks for the updates @Larpoux! I'm new to Dart and wasn't familiar with byte operations in Dart, but I seem to have figured everything out, thanks to the code provided by @GGLabCenter.

camphan12993 commented 2 years ago

any update?

Goallying commented 1 year ago

I start the recorder with "Codec.pcm16", and the callback buffer.data.length = 12800. What happened?

My code:

    await _recorder.startRecorder(
      toStream: _recordingDataController!.sink,
      codec: Codec.pcm16,
      numChannels: 1,
      sampleRate: 16000,
    );

github-actions[bot] commented 11 months ago

This issue is stale because it has been open 90 days with no activity. Leave a comment or this will be closed in 7 days.