SimformSolutionsPvtLtd / react-native-audio-waveform

React Native component to show an audio waveform with ease in a React Native application ✨

[BUG] Issues with Duration Count and Waveform Load Times on Android and iOS #57

Open MwSpaceLLC opened 1 month ago

MwSpaceLLC commented 1 month ago

Hello,

First of all, thank you for the fantastic work on this module. We have successfully integrated it with Expo, and it works perfectly. However, we have encountered a few issues and have some suggestions for improvement.

[BUG] ON ANDROID | (dev & prod build)

Issue with Duration Count: When setting the default recording quality and extension to .m4a, the duration count loads the first time but then stops updating and remains at 0. This occurs, for example, when using router navigation to open a chat and then navigating back.

Here are the default recording settings for Expo AV audio that we pass to the waveform:

RecordingOptionsPresets.HIGH_QUALITY = {
  isMeteringEnabled: true,
  android: {
    extension: '.m4a',
    outputFormat: AndroidOutputFormat.MPEG_4,
    audioEncoder: AndroidAudioEncoder.AAC,
    sampleRate: 44100,
    numberOfChannels: 2,
    bitRate: 128000,
  },
  ios: {
    extension: '.m4a',
    outputFormat: IOSOutputFormat.MPEG4AAC,
    audioQuality: IOSAudioQuality.MAX,
    sampleRate: 44100,
    numberOfChannels: 2,
    bitRate: 128000,
    linearPCMBitDepth: 16,
    linearPCMIsBigEndian: false,
    linearPCMIsFloat: false,
  },
  web: {
    mimeType: 'audio/webm',
    bitsPerSecond: 128000,
  },
};

[BUG] ON IOS AND ANDROID | (dev & prod build)

Issue with Waveform Load Times: The waveform spectrum takes about half a second to load on iOS, which disrupts the user experience. On Android, when reloading the chat page (e.g., a FlatList of messages), the waveform component mounts immediately, but each audio spectrum takes a significant amount of time to appear, about half a second per message. So with 10-20 messages, the spectra load asynchronously one after another, which looks visually buggy to the user.

SUGGESTION

It would be very helpful if you could implement a caching mechanism or similar functionality for the spectrum. That way, when the component loads, it would not need to regenerate the spectrum by analyzing the audio track every time; instead, it could load the previously computed data. This would significantly optimize the load times of the audio spectrum.

For example, adding props to the component like the ones below could allow the waveform to be generated once and then loaded from the store, preventing re-render delays:

<Waveform initialWaveData={localData} onWaveLoad={(data) => functionForStoreLocallySpectrumData(data)}/>
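For illustration only, here is a minimal consumer-side sketch of the idea, assuming the proposed initialWaveData / onWaveLoad props existed and using @react-native-async-storage/async-storage as the store (both are assumptions; neither prop is part of the current API):

import AsyncStorage from '@react-native-async-storage/async-storage';
import React, { useEffect, useState } from 'react';
import { Waveform } from '@simform_solutions/react-native-audio-waveform';

// Sketch only: initialWaveData and onWaveLoad are the proposed props,
// they do not exist in the library yet.
const CachedWaveform = ({ path }: { path: string }) => {
  const [localData, setLocalData] = useState<number[] | null>(null);
  const [loaded, setLoaded] = useState(false);

  useEffect(() => {
    // look up previously stored wave data for this file
    AsyncStorage.getItem(`waveform:${path}`)
      .then(json => setLocalData(json ? JSON.parse(json) : null))
      .finally(() => setLoaded(true));
  }, [path]);

  if (!loaded) return null; // wait for the cache lookup before mounting

  return (
    <Waveform
      mode="static"
      path={path}
      initialWaveData={localData ?? undefined}
      onWaveLoad={(data: number[]) =>
        // persist freshly extracted data for the next mount
        AsyncStorage.setItem(`waveform:${path}`, JSON.stringify(data))
      }
    />
  );
};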

[VIDEO RECORDING OF THE BUG | ANDROID 14]

https://github.com/user-attachments/assets/5d4371ae-db20-4f36-aec3-1c4afc905a03

As MwSpace, we are very interested in this implementation to utilize this component in our production app.

If needed, we are also open to commercial collaboration.

Thank you, and we look forward to hearing from you.

Best regards, Alex, CEO of MwSpace LLC

MwSpaceLLC commented 1 month ago
src/components/Waveform/Waveform.tsx

In this component, once the waveform data has been extracted, you can fire an event, as in the following example.

Two new props would be added: initialWaveData and onWaveLoad.

  const getAudioWaveFormForPath = async (noOfSample: number) => {
    if (!isNil(path) && !isEmpty(path)) {
      try {
        (onChangeWaveformLoadState as Function)?.(true);

        // [MWSPACE] new initialWaveData prop: reuse stored wave data when
        // available, otherwise extract it from the audio file as before
        const result =
          initialWaveData ??
          (await extractWaveformData({
            path: path,
            playerKey: `PlayerFor${path}`,
            noOfSamples: noOfSample,
          }));
        (onChangeWaveformLoadState as Function)?.(false);

        if (!isNil(result) && !isEmpty(result)) {
          const waveforms = head(result);
          if (!isNil(waveforms) && !isEmpty(waveforms)) {

            setWaveform(waveforms);

            // [MWSPACE] new onWaveLoad prop: hand the wave data to the
            // caller so it can be persisted and passed back later
            (onWaveLoad as Function)?.(waveforms);

            await preparePlayerAndGetDuration();
          }
        }
      } catch (err) {
        (onError as Function)(err);
        (onChangeWaveformLoadState as Function)?.(false);
        console.error(err);
      }
    } else {
      (onError as Function)(
        `Can not find waveform for mode ${mode} path: ${path}`
      );
      console.error(`Can not find waveform for mode ${mode} path: ${path}`);
    }
  };

Thank you, and we look forward to hearing from you. Best regards, Alex, CEO of MwSpace LLC

MwSpaceLLC commented 1 month ago

An update:

I have written a custom component to test the stored data, but one bug is still not solved.

I think the problem is the candle render method.

May I suggest trying to render the wave as a single static image instead of rendering each candle line separately? A sketch of that idea follows, and my full test component is after it.
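For illustration only, a rough sketch of the static-image idea, assuming react-native-svg is installed and the extracted amplitudes are normalized to the 0..1 range (both assumptions): all samples are drawn as one SVG polyline, so the native side renders a single element instead of one view per candle.

import React from 'react';
import Svg, { Polyline } from 'react-native-svg';

// Sketch only: draws the whole waveform in one shot instead of
// rendering a separate candle view per sample.
const StaticWave = ({ data, width, height }: { data: number[]; width: number; height: number }) => {
  const step = width / Math.max(data.length, 1);
  // assumes amplitudes are normalized to the 0..1 range
  const points = data
    .map((amp, i) => `${i * step},${height - amp * height}`)
    .join(' ');
  return (
    <Svg width={width} height={height}>
      <Polyline points={points} stroke="#888" strokeWidth={1} fill="none" />
    </Svg>
  );
};

export default StaticWave;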

import {clamp, floor, head, isEmpty, isNil} from 'lodash';

import React, {
    forwardRef,
    useEffect,
    useImperativeHandle,
    useRef,
    useState,
} from 'react';

import {
    PanResponder,
    ScrollView,
    View,
} from 'react-native';

import {
    useAudioPermission,
    useAudioPlayer,
    useAudioRecorder,
} from '@simform_solutions/react-native-audio-waveform/lib/hooks';

// import {
//     WaveformCandle
// } from '@simform_solutions/react-native-audio-waveform/lib/components/WaveformCandle/WaveformCandle';

import {
    PlayerState,
    RecorderState,
    FinishMode,
    PermissionStatus,
    DurationType,
    UpdateFrequency,
} from '@simform_solutions/react-native-audio-waveform/lib/constants';
import CustomWaveFromCandle from "@/components/Custom/CustomWaveFromCandle";

/**
 *
 * @type {React.ForwardRefExoticComponent<React.PropsWithoutRef<{}> & React.RefAttributes<unknown>>}
 */
const CustomWaveform = forwardRef((props, ref) => {

    const {

        mode,

        path,

        candleSpace = 2,
        candleWidth = 5,
        containerStyle = {},
        waveColor,
        scrubColor,
        onPlayerStateChange,
        onRecorderStateChange,

        // initial data | prevent extractor
        initialWaveformData = () => null,

        // pass data load waveform event
        onWaveformDataLoad = () => null,

        // bug fix not a function type
        onPanStateChange = () => null,

        onError,

        onCurrentProgressChange,

        candleHeightScale = 3,

        onChangeWaveformLoadState,

    } = props;

    const viewRef = useRef(null);
    const scrollRef = useRef(null);
    const [waveform, setWaveform] = useState([]);
    const [viewLayout, setViewLayout] = useState(null);
    const [seekPosition, setSeekPosition] = useState(null);
    const [songDuration, setSongDuration] = useState(0);
    const [noOfSamples, setNoOfSamples] = useState(0);
    const [currentProgress, setCurrentProgress] = useState(0);
    const [panMoving, setPanMoving] = useState(false);
    const [playerState, setPlayerState] = useState(PlayerState.stopped);
    const [recorderState, setRecorderState] = useState(RecorderState.stopped);

    const {
        extractWaveformData,
        preparePlayer,
        getDuration,
        seekToPlayer,
        playPlayer,
        stopPlayer,
        pausePlayer,
        onCurrentDuration,
        onDidFinishPlayingAudio,
        onCurrentRecordingWaveformData,
    } = useAudioPlayer();

    const {startRecording, stopRecording, pauseRecording, resumeRecording} = useAudioRecorder();

    const {checkHasAudioRecorderPermission} = useAudioPermission();

    const preparePlayerForPath = async () => {
        if (!isNil(path) && !isEmpty(path)) {
            try {
                const prepare = await preparePlayer({
                    path,
                    playerKey: `PlayerFor${path}`,
                    updateFrequency: UpdateFrequency.medium,
                    volume: 10,
                });
                return Promise.resolve(prepare);
            } catch (err) {
                return Promise.reject(err);
            }
        } else {
            return Promise.reject(new Error(`Can not start player for path: ${path}`));
        }
    };

    const getAudioDuration = async () => {
        try {
            const duration = await getDuration({
                playerKey: `PlayerFor${path}`,
                durationType: DurationType.max,
            });

console.log('duration', duration);

            if (!isNil(duration)) {
                setSongDuration(duration);
            } else {
                return Promise.reject(new Error(`Could not get duration for path: ${path}`));
            }
        } catch (err) {
            return Promise.reject(err);
        }
    };

    const preparePlayerAndGetDuration = async () => {
        try {
            const prepare = await preparePlayerForPath();
            if (prepare) {
                await getAudioDuration();
            }
        } catch (err) {
            console.error(err);
onError?.(err);
        }
    };

    const getAudioWaveFormForPath = async (noOfSample) => {
        if (!isNil(path) && !isEmpty(path)) {
            try {

                onChangeWaveformLoadState?.(true);

                // [MWSPACE] read the stored wave data once, if a loader was passed
                const storedWaveformData = typeof initialWaveformData === 'function'
                    ? await initialWaveformData()
                    : null;
                const hasStoredData = !isNil(storedWaveformData) && !isEmpty(storedWaveformData);

                // [MWSPACE] reuse the stored data, otherwise extract it from the file
                const result = hasStoredData
                    ? storedWaveformData
                    : await extractWaveformData({
                        path: path,
                        playerKey: `PlayerFor${path}`,
                        noOfSamples: noOfSample,
                    });

                // [MWSPACE] only freshly extracted data is handed back for storage
                if (typeof onWaveformDataLoad === 'function' && !hasStoredData) {
                    await onWaveformDataLoad(result);
                }

                onChangeWaveformLoadState?.(false);

                if (!isNil(result) && !isEmpty(result)) {

                    const waveforms = head(result);

                    if (!isNil(waveforms) && !isEmpty(waveforms)) {

                        setWaveform(waveforms);

                        await preparePlayerAndGetDuration();
                    }
                }
            } catch (err) {
                onError?.(err);
                onChangeWaveformLoadState?.(false);
                console.error(err);
            }
        } else {
            onError?.(`Can not find waveform for mode ${mode} path: ${path}`);
            console.error(`Can not find waveform for mode ${mode} path: ${path}`);
        }
    };

    const stopPlayerAction = async () => {
        if (mode === 'static') {
            try {
                const result = await stopPlayer({
                    playerKey: `PlayerFor${path}`,
                });
                await preparePlayerForPath();
                if (!isNil(result) && result) {
                    setCurrentProgress(0);
                    setPlayerState(PlayerState.stopped);
                    return Promise.resolve(result);
                } else {
                    return Promise.reject(new Error(`error in stopping player for path: ${path}`));
                }
            } catch (err) {
                return Promise.reject(err);
            }
        } else {
            return Promise.reject(new Error('error in stopping player: mode is not static'));
        }
    };

    const startPlayerAction = async (args) => {
        if (mode === 'static') {
            try {
                const play = await playPlayer({
                    finishMode: FinishMode.stop,
                    playerKey: `PlayerFor${path}`,
                    path: path,
                    ...args,
                });

                if (play) {
                    setPlayerState(PlayerState.playing);
                    return Promise.resolve(true);
                } else {
                    return Promise.reject(new Error(`error in starting player for path: ${path}`));
                }
            } catch (error) {
                return Promise.reject(error);
            }
        } else {
            return Promise.reject(new Error('error in starting player: mode is not static'));
        }
    };

    const pausePlayerAction = async () => {
        if (mode === 'static') {
            try {
                const pause = await pausePlayer({
                    playerKey: `PlayerFor${path}`,
                });
                if (pause) {
                    setPlayerState(PlayerState.paused);
                    return Promise.resolve(true);
                } else {
                    return Promise.reject(new Error(`error in pause player for path: ${path}`));
                }
            } catch (error) {
                return Promise.reject(error);
            }
        } else {
            return Promise.reject(new Error('error in pausing player: mode is not static'));
        }
    };

    const startRecordingAction = async (args) => {
        if (mode === 'live') {
            try {
                const hasPermission = await checkHasAudioRecorderPermission();

                if (hasPermission === PermissionStatus.granted) {
                    const start = await startRecording(args);
                    if (!isNil(start) && start) {
                        setRecorderState(RecorderState.recording);
                        return Promise.resolve(true);
                    } else {
                        return Promise.reject(new Error('error in start recording action'));
                    }
                } else {
                    return Promise.reject(new Error('error in start recording: audio recording permission is not granted'));
                }
            } catch (err) {
                return Promise.reject(err);
            }
        } else {
            return Promise.reject(new Error('error in start recording: mode is not live'));
        }
    };

    const stopRecordingAction = async () => {
        if (mode === 'live') {
            try {
                const data = await stopRecording();
                if (!isNil(data) && !isEmpty(data)) {
                    setWaveform([]);
                    const pathData = head(data);
                    if (!isNil(pathData)) {
                        setRecorderState(RecorderState.stopped);
                        return Promise.resolve(pathData);
                    } else {
                        return Promise.reject(new Error('error in stopping recording. can not get path of recording'));
                    }
                } else {
                    return Promise.reject(new Error('error in stopping recording. can not get path of recording'));
                }
            } catch (err) {
                return Promise.reject(err);
            }
        } else {
            return Promise.reject(new Error('error in stop recording: mode is not live'));
        }
    };

    const pauseRecordingAction = async () => {
        if (mode === 'live') {
            try {
                const pause = await pauseRecording();
                if (!isNil(pause) && pause) {
                    setRecorderState(RecorderState.paused);
                    return Promise.resolve(pause);
                } else {
                    return Promise.reject(new Error('Error in pausing recording audio'));
                }
            } catch (err) {
                return Promise.reject(err);
            }
        } else {
            return Promise.reject(new Error('error in pause recording: mode is not live'));
        }
    };

    const resumeRecordingAction = async () => {
        if (mode === 'live') {
            try {
                const hasPermission = await checkHasAudioRecorderPermission();
                if (hasPermission === PermissionStatus.granted) {
                    const resume = await resumeRecording();
                    if (!isNil(resume)) {
                        setRecorderState(RecorderState.recording);
                        return Promise.resolve(resume);
                    } else {
                        return Promise.reject(new Error('Error in resume recording'));
                    }
                } else {
                    return Promise.reject(new Error('error in resume recording: audio recording permission is not granted'));
                }
            } catch (err) {
                return Promise.reject(err);
            }
        } else {
            return Promise.reject(new Error('error in resume recording: mode is not live'));
        }
    };

    useEffect(() => {
        if (!isNil(viewLayout?.width)) {
            const getNumberOfSamples = floor(
                (viewLayout?.width ?? 0) / (candleWidth + candleSpace)
            );
            setNoOfSamples(getNumberOfSamples);
            if (mode === 'static') {
                getAudioWaveFormForPath(getNumberOfSamples);
            }
        }
    }, [viewLayout, mode, candleWidth, candleSpace]);

    useEffect(() => {
        if (!isNil(seekPosition)) {
            if (mode === 'static') {
                const seekAmount = (seekPosition?.pageX - (viewLayout?.x ?? 0)) / (viewLayout?.width ?? 1);
                const clampedSeekAmount = clamp(seekAmount, 0, 1);

                if (!panMoving) {
                    seekToPlayer({
                        playerKey: `PlayerFor${path}`,
                        progress: clampedSeekAmount * songDuration,
                    });
                    if (playerState === PlayerState.playing) {
                        startPlayerAction();
                    }
                }

                setCurrentProgress(clampedSeekAmount * songDuration);
            }
        }
    }, [seekPosition, panMoving, mode, songDuration]);

    useEffect(() => {
        const tracePlayerState = onDidFinishPlayingAudio(async data => {
            if (data.playerKey === `PlayerFor${path}`) {
                if (data.finishType === FinishMode.stop) {
                    setPlayerState(PlayerState.stopped);
                    setCurrentProgress(0);
                    await preparePlayerForPath();
                }
            }
        });

        const tracePlaybackValue = onCurrentDuration(data => {
            if (data.playerKey === `PlayerFor${path}`) {
                setCurrentProgress(data.currentDuration);
            }
        });

        const traceRecorderWaveformValue = onCurrentRecordingWaveformData(result => {
            if (mode === 'live') {
                if (!isNil(result.currentDecibel)) {
                    setWaveform(prev => [...prev, result.currentDecibel]);
                    if (scrollRef.current) {
                        scrollRef.current.scrollToEnd({animated: true});
                    }
                }
            }
        });

        return () => {
            tracePlayerState.remove();
            tracePlaybackValue.remove();
            traceRecorderWaveformValue.remove();
        };
    }, []);

    useEffect(() => {
        if (!isNil(onPlayerStateChange)) {
            onPlayerStateChange(playerState);
        }
    }, [playerState]);

    useEffect(() => {
        if (!isNil(onRecorderStateChange)) {
            onRecorderStateChange(recorderState);
        }
    }, [recorderState]);

    useEffect(() => {
        if (panMoving) {
            if (playerState === PlayerState.playing) {
                pausePlayerAction();
            }
        } else {
            if (playerState === PlayerState.paused) {
                startPlayerAction();
            }
        }
    }, [panMoving]);

    const panResponder = useRef(
        PanResponder.create({
            onMoveShouldSetPanResponder: () => true,
            onPanResponderGrant: () => {
                setPanMoving(true);
                onPanStateChange(true);
            },
            onPanResponderStart: () => {
            },
            onPanResponderMove: event => {
                setSeekPosition(event.nativeEvent);
            },
            onPanResponderEnd: () => {
                onPanStateChange(false);
                setPanMoving(false);
            },
        })
    ).current;

    useEffect(() => {
        if (!isNil(onCurrentProgressChange)) {
            onCurrentProgressChange(currentProgress, songDuration);
        }
    }, [currentProgress, songDuration, onCurrentProgressChange]);

    useImperativeHandle(ref, () => ({
        startPlayer: startPlayerAction,
        stopPlayer: stopPlayerAction,
        pausePlayer: pausePlayerAction,
        resumePlayer: startPlayerAction,
        startRecord: startRecordingAction,
        pauseRecord: pauseRecordingAction,
        stopRecord: stopRecordingAction,
        resumeRecord: resumeRecordingAction,
    }));

    return (
        <View style={containerStyle}>
            <View
                ref={viewRef}
                style={{alignItems: 'center', flexDirection: 'row', height: '100%',}}
                onLayout={() => {
                    viewRef.current?.measure((_x, _y, width, height, pageX, pageY) => {
                        setViewLayout({height, width, x: pageX, y: pageY});
                    });
                }}
                {...(mode === 'static' ? panResponder.panHandlers : {})}>
                <ScrollView
                    horizontal
                    ref={scrollRef}
                    style={{height: '100%'}}
                    scrollEnabled={mode === 'live'}>
                    {waveform.map((amplitude, indexCandle) => (
                        <CustomWaveFromCandle
                            key={indexCandle}
                            index={indexCandle}
                            amplitude={amplitude}
                            parentViewLayout={viewLayout}
                            {...{
                                candleWidth,
                                candleSpace,
                                noOfSamples,
                                songDuration,
                                currentProgress,
                                waveColor,
                                scrubColor,
                                candleHeightScale,
                            }}
                        />
                    ))}
                </ScrollView>
            </View>
        </View>
    );
});

export default React.memo(CustomWaveform);

MwSpaceLLC commented 1 month ago

Also, about the bad performance on Android: it comes from the Waveform component's onLayout.

The measure method could be expensive if executed frequently. Ensure that onLayout is not called unnecessarily and that the measurement calculations are performed only when strictly necessary.

Besides saving the waveform data, I also tried changing this method:

onLayout={() => {
    viewRef.current?.measure((_x, _y, width, height, pageX, pageY) => {
        setViewLayout({height, width, x: pageX, y: pageY});
    });
}}

to this:

onLayout={(event) => {
    // note: event.nativeEvent.layout gives x/y relative to the parent,
    // unlike measure()'s pageX/pageY, so the seek math must account for this
    const {width, height, x, y} = event.nativeEvent.layout;
    setViewLayout({height, width, x, y});
}}

Certainly, saving the wave data in local storage has slightly improved performance on Android, but the user still has to wait for all the messages to load.

Ultimately, with these improvements to the Waveform component (saving the extractWaveformData result instead of calling it again, and optimizing the rendering), there are visible improvements in the video:

https://github.com/user-attachments/assets/5e946a64-1c52-4201-9c36-c7886b806f4e

MwSpaceLLC commented 1 month ago

[ANOTHER BUG] | ANDROID DOES NOT PLAY WHEN THE DURATION IS NOT CALCULATED

anatooly commented 1 month ago

onCurrentProgressChange={(
  currentProgress: number,
  songDuration: number,
) => {
  console.log({ currentProgress, songDuration });
}}

This logs:

{"currentProgress": "9388", "songDuration": "5048"}

so currentProgress > songDuration.

P.S.: I also think the values are returned as integers on iOS but as strings on Android.
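Until the native sides return consistent types, a minimal consumer-side workaround sketch (setProgress here is a hypothetical handler in your own component): coerce both values to numbers, then clamp progress to the duration.

onCurrentProgressChange={(currentProgress, songDuration) => {
  // Android may deliver strings, so coerce both values to numbers
  const duration = Number(songDuration);
  // clamp so progress can never run past the end of the track
  const progress = Math.min(Number(currentProgress), duration);
  setProgress(duration > 0 ? progress / duration : 0); // hypothetical handler
}}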

anatooly commented 1 month ago

@MwSpaceLLC How did you fix this issue? :-)