SimformSolutionsPvtLtd / audio_waveforms

Use this plugin to generate waveforms while recording audio (in any file format supported by the given encoders) or from existing audio files. Gestures can be used to scroll through the waveform or seek to any position during playback, and the waveforms can be styled.
https://pub.dev/packages/audio_waveforms
MIT License

Issues in waveform visualization and play/pause seek #284

Closed MoazzamAliSE closed 1 month ago

MoazzamAliSE commented 2 months ago

https://github.com/SimformSolutionsPvtLtd/audio_waveforms/assets/102206806/9025e922-14f0-4d50-9e02-f593050a401c

Hello, I hope this message finds you well. I appreciate your efforts on this amazing package, but I have encountered a couple of issues.

  1. The waveform (the blue part in the video) reaches the end while the audio is still playing; the active wave completes, but audio remains. Visually it indicates that the audio has finished, yet by listening, or by looking at the counter on the left, it is clear that the audio is still playing. I have tried many things, such as setting the width, the clip behaviour, and more. Please check this out.

  2. The audio can be seeked while it is playing, but seeking is too sensitive, and when it is paused or not playing, seeking does not work at all. When I use WaveformType.long, the inactive wave can be seeked, but there is no property that exposes the seek value, so I cannot set the position explicitly. Also, after seeking on the inactive wave and then playing, the audio resumes from where I paused it rather than from the seeked position, which is the opposite of what I need. Thank you.

Ujas-Majithiya commented 2 months ago

@MoazzamAliSE Thank you for creating the issue. Can you please share your implementation of the controller and the widget so that we can debug this?

MoazzamAliSE commented 2 months ago

OK, sure.

MoazzamAliSE commented 2 months ago

// This is the AudioWave widget

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/material.dart';

class AudioPlayingWaveForm extends StatelessWidget {
  const AudioPlayingWaveForm({
    super.key,
    required this.playerController,
    required this.width,
    required this.height,
    required this.waveformType,
  });

  final double width;
  final double height;
  final PlayerController playerController;
  final WaveformType waveformType;

  @override
  Widget build(BuildContext context) {
    return Container(
      decoration: BoxDecoration(
        color: Colors.grey[100],
        borderRadius: const BorderRadius.all(
          Radius.circular(12),
        ),
      ),
      child: AudioFileWaveforms(
        size: Size(width, height),
        playerController: playerController,
        enableSeekGesture: true,
        waveformType: waveformType,
        padding: const EdgeInsets.all(8),
        playerWaveStyle: PlayerWaveStyle(
          seekLineColor: Colors.grey.shade300,
          scaleFactor: 300,
          waveThickness: 3,
          fixedWaveColor: const Color(0xffc4c4c4),
          liveWaveColor: cPrimaryDarkColor,
          waveCap: StrokeCap.round,
        ),
      ),
    );
  }
}

// This is the use of the widget

Row(
  children: [
    Expanded(
      child: AudioPlayingWaveForm(
        waveformType: WaveformType.fitWidth,
        width: Get.width * 0.9,
        height: 120,
        playerController: controller.playerController,
      ),
    ),
    const SizedBox(
      width: 10,
    ),
    controller.isLoading
        ? const Center(
            child: CircularProgressIndicator(),
          )
        : GestureDetector(
            onTap: () {
              controller.startPlayingRecording();
            },
            child: SvgPicture.asset(
              controller.playerController.playerState == PlayerState.playing
                  ? AssetRef.pausedButtonIcon
                  : AssetRef.playCircle,
              height: 57.0,
              width: 57,
            ),
          ),
  ],
)

// Controller

import 'dart:async';
import 'dart:io';

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/foundation.dart';
import 'package:get/get.dart';

class PastVoiceNoteAndTranscribeController extends GetxController {
  final _playerController = PlayerController();
  String _localFilePath = '';
  bool _isLoading = false;

  PlayerController get playerController => _playerController;
  String get localFilePath => _localFilePath;
  bool get isLoading => _isLoading;

  void setLoading(bool loading) {
    _isLoading = loading;
    update();
  }

  // bool _isPlaying = false;
  // bool get isPlaying => _isPlaying;
  // void setPlaying(bool value) {
  //   _isPlaying = value;
  //   update();
  // }

  int _playerDuration = 0;

  Timer? _timer;
  Timer? get timer => _timer;

  int get playerDuration => _playerDuration;

  Future<void> preparePlayerController(String url) async {
    _localFilePath = url;
    setLoading(true);
    try {
      await _playerController
          .preparePlayer(path: _localFilePath)
          .whenComplete(() => setLoading(false));
      update();
    } catch (e) {
      if (kDebugMode) {
        print('Error preparing player controller: $e');
      }
      setLoading(false);
    }
  }

  // Polls the current position every 100 ms and cancels the timer once
  // the current position reaches the total duration.
  void _startTimerReverse(bool isPaused) {
    if (!isPaused) {
      _timer =
          Timer.periodic(const Duration(milliseconds: 100), (Timer t) async {
        var currentDuration =
            await _playerController.getDuration(DurationType.current);
        var maxDuration =
            await _playerController.getDuration(DurationType.max);
        maxDuration = maxDuration ~/ 1000;
        currentDuration = currentDuration ~/ 1000;
        _playerDuration = currentDuration;
        update();
        if (maxDuration == currentDuration) {
          t.cancel();
          _timer?.cancel();
          // completeAudioPlayed();
          update();
        }
        update();
      });
    } else {
      _timer?.cancel();
    }
  }

  // void completeAudioPlayed() {
  //   setPlaying(false);
  // }

  void pauseAudio() async {
    if (_playerController.playerState == PlayerState.playing) {
      _startTimerReverse(true);
      // setPlaying(false);
      await _playerController.pausePlayer();
    }
  }

  Future<void> startPlayingRecording() async {
    if (_playerController.playerState.isPlaying) {
      pauseAudio();
    } else if (_playerController.playerState.isPaused) {
      _startTimerReverse(false);
      await _playerController
          .startPlayer(finishMode: FinishMode.pause)
          .whenComplete(() {
        // setPlaying(false);
      });
      // setPlaying(true);
      return;
    } else {
      try {
        if (_playerController.playerState.isStopped) {
          // setPlaying(false);
          print("no recording");
          return;
        }
        if (_localFilePath.isNotEmpty) {
          final file = File(_localFilePath);
          if (await file.exists()) {
            await _playerController.startPlayer(
              finishMode: FinishMode.pause,
            );
            // setPlaying(true);
            int checkDuration = await _playerController.getDuration();
            if (checkDuration != -1 && checkDuration >= 0) {
              _startTimerReverse(false);
            }
            update();
            return;
          }
        }
      } catch (e) {
        if (kDebugMode) {
          print(e);
        }
      }
    }
  }

  @override
  void dispose() {
    _timer?.cancel();
    _playerController.dispose();
    super.dispose();
  }
}
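As a side note (not part of the code above): instead of polling with a periodic Timer, the controller could subscribe to the position streams that PlayerController exposes. A minimal sketch, assuming the installed version of audio_waveforms provides onCurrentDurationChanged and onCompletion:

// Sketch: track playback position via the controller's streams instead
// of polling. onCurrentDurationChanged reports the position in milliseconds.
StreamSubscription<int>? _durationSub;
StreamSubscription<void>? _completionSub;

void listenToPlayer() {
  _durationSub = _playerController.onCurrentDurationChanged.listen((ms) {
    _playerDuration = ms ~/ 1000; // keep seconds, as the Timer above does
    update();
  });

  _completionSub = _playerController.onCompletion.listen((_) {
    // Playback finished; refresh the UI.
    update();
  });
}

// The subscriptions would need to be cancelled in dispose()/onClose().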

MoazzamAliSE commented 2 months ago

Sir, I know that I have used the Expanded widget and that I also pass a width, but this is causing the issue on another screen as well.

And one more issue: when I navigate back from the screen, the audio does not stop automatically even though dispose is triggered. I have to stop the audio manually using PopScope.

Ujas-Majithiya commented 2 months ago

@MoazzamAliSE As mentioned in the documentation, if you are using WaveformType.fitWidth, you have to provide the number of samples while preparing the player: controller.preparePlayer(noOfSamples: ...).

To get the number of samples that fits your custom width, you can use the getSamplesForWidth function available on PlayerWaveStyle.

controller.preparePlayer(
  noOfSamples: playerWaveStyle.getSamplesForWidth(width),
  ...
);

Just make sure to use the same PlayerWaveStyle instance as the one you pass to the waveform widget.
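For reference, a minimal sketch of how this could fit together with the widget shown earlier; the names playerWaveStyle, waveformWidth, and localFilePath are illustrative, and the style values are copied from the user's code (cPrimaryDarkColor is the app's own constant):

// Sketch: one PlayerWaveStyle instance shared by preparePlayer and the
// waveform widget, so the sample count matches the rendered width.
final playerWaveStyle = PlayerWaveStyle(
  seekLineColor: Colors.grey.shade300,
  scaleFactor: 300,
  waveThickness: 3,
  fixedWaveColor: const Color(0xffc4c4c4),
  liveWaveColor: cPrimaryDarkColor,
  waveCap: StrokeCap.round,
);

final waveformWidth = Get.width * 0.9;

// Derive noOfSamples from the same style and the same width that the
// AudioFileWaveforms widget will be given.
await playerController.preparePlayer(
  path: localFilePath,
  noOfSamples: playerWaveStyle.getSamplesForWidth(waveformWidth),
);

// Later, in the widget tree:
AudioFileWaveforms(
  size: Size(waveformWidth, 120),
  playerController: playerController,
  waveformType: WaveformType.fitWidth,
  playerWaveStyle: playerWaveStyle, // same instance as above
);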

Ujas-Majithiya commented 2 months ago

Now, for the other question, I tested by disposing the controller before navigating to another screen, like this:

onPressed: () {
  controller.dispose();
  Navigator.push(context, MaterialPageRoute(
    builder: (context) {
      return Container();
    },
  ));
},

And it does stop the player for me. Can you please share more info on this?
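For illustration only (this is not confirmed anywhere in the thread): if the route can be popped without dispose being called explicitly, another option is to stop the player from the GetX controller's onClose hook. A minimal sketch, assuming the PastVoiceNoteAndTranscribeController shown above and that stopPlayer() is available on PlayerController:

// Sketch: ensure playback stops when the GetX controller is torn down.
// onClose() is GetX's teardown hook; it runs when the controller is
// removed (e.g. when the route that created it is popped).
@override
void onClose() {
  _timer?.cancel();
  if (_playerController.playerState == PlayerState.playing) {
    // stopPlayer() halts playback; dispose() then releases the player.
    _playerController.stopPlayer();
  }
  _playerController.dispose();
  super.onClose();
}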

Ujas-Majithiya commented 1 month ago

Closing this issue since no response was provided.