xiqi / react-native-live-audio-stream

Get live audio stream data for React Native (works for iOS and Android)
MIT License

Attempt to invoke virtual method 'void android.media.AudioRecord.startRecording()' on a null object reference #27

Closed · eedeebee closed this issue 4 months ago

eedeebee commented 4 months ago

Here's my code (or the relevant pieces of it) using RN 0.72 on an Android device:

const options = {
  sampleRate: 16000,     // default is 44100; 16000 is adequate for accurate voice recognition
  channels: 1,           // 1 or 2, default 1
  bitsPerSample: 16,     // 8 or 16, default 16
  audioSource: 6,        // android only
  bufferSize: 4096,      // default is 2048
  hasAudioHeader: false,
};

LiveAudioStream.init(options);

LiveAudioStream.on('data', audio => {
  // Example usage
  const data = Array.from(base64ToSigned16BitIntArray(audio));
  logHighestAndLowest(data);

  if (data.length !== 4096) {
    console.log("Short buffer");
    return;
  }

  // ... more code
});
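(The base64ToSigned16BitIntArray helper isn't included in the issue. A minimal sketch of such a decoder, assuming the buffer npm polyfill since React Native has no built-in Buffer, might look like this:)

import { Buffer } from 'buffer';

// Hypothetical decoder: turns the base64 chunk from the 'data' event into an
// Int16Array of signed PCM samples.
const base64ToSigned16BitIntArray = (base64) => {
  const bytes = Buffer.from(base64, 'base64');
  const samples = new Int16Array(bytes.length / 2);
  for (let i = 0; i < samples.length; i++) {
    // AudioRecord delivers PCM_16BIT data in little-endian byte order.
    samples[i] = bytes.readInt16LE(i * 2);
  }
  return samples;
};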

And...

const start = () => {
  console.log("start");
  LiveAudioStream.start();
};

When I hit start I see:

02-27 17:24:52.635 23062 24405 E unknown:ReactNative: java.lang.NullPointerException: Attempt to invoke virtual method 'void android.media.AudioRecord.startRecording()' on a null object reference
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at com.imxiqi.rnliveaudiostream.RNLiveAudioStreamModule.start(RNLiveAudioStreamModule.java:92)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at java.lang.reflect.Method.invoke(Native Method)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at com.facebook.react.bridge.JavaMethodWrapper.invoke(JavaMethodWrapper.java:372)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at com.facebook.react.bridge.JavaModuleWrapper.invoke(JavaModuleWrapper.java:188)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at com.facebook.jni.NativeRunnable.run(Native Method)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at android.os.Handler.handleCallback(Handler.java:958)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at android.os.Handler.dispatchMessage(Handler.java:99)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at com.facebook.react.bridge.queue.MessageQueueThreadHandler.dispatchMessage(MessageQueueThreadHandler.java:27)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at android.os.Looper.loopOnce(Looper.java:205)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at android.os.Looper.loop(Looper.java:294)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at com.facebook.react.bridge.queue.MessageQueueThreadImpl$4.run(MessageQueueThreadImpl.java:228)
02-27 17:24:52.635 23062 24405 E unknown:ReactNative:   at java.lang.Thread.run(Thread.java:1012)
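The trace shows startRecording() being called on a null AudioRecord inside RNLiveAudioStreamModule.start(), which usually means init() never managed to construct the recorder, for example because RECORD_AUDIO had not been granted at that point. A hedged sketch of a permission-gated start (the safeStart name is made up for illustration):

import { PermissionsAndroid, Platform } from 'react-native';
import LiveAudioStream from 'react-native-live-audio-stream';

// Only call init/start once the microphone permission is granted, so the
// native side can actually construct its AudioRecord.
const safeStart = async (options) => {
  if (Platform.OS === 'android') {
    const result = await PermissionsAndroid.request(
      PermissionsAndroid.PERMISSIONS.RECORD_AUDIO
    );
    if (result !== PermissionsAndroid.RESULTS.GRANTED) {
      console.warn('RECORD_AUDIO not granted; not starting the stream');
      return false;
    }
  }
  LiveAudioStream.init(options);
  LiveAudioStream.start();
  return true;
};

Whether permission timing is the actual cause here depends on the device and build; the sketch only guards the most common path to this NullPointerException.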
eedeebee commented 4 months ago

Sorry, this bug was in a fork of the project that didn't have its own issues enabled. Closing.

sonhmivirse commented 1 month ago

Don't know why you closed this, @eedeebee. I ran into this problem too. This is my code in Expo:

import { useEffect, useState } from "react";
import { Button, PermissionsAndroid, Platform, StyleSheet, Text, View } from "react-native";
import LiveAudioStream from "react-native-live-audio-stream";

const requestMicrophonePermission = async () => {
  if (Platform.OS === "android") {
    try {
      const granted = await PermissionsAndroid.request(
        PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
        {
          title: "Microphone Permission",
          message: "App needs access to your microphone to record audio.",
          buttonNeutral: "Ask Me Later",
          buttonNegative: "Cancel",
          buttonPositive: "OK",
        }
      );
      return granted === PermissionsAndroid.RESULTS.GRANTED;
    } catch (err) {
      console.warn(err);
      return false;
    }
  }
  return true;
};

export default function App() {
  const [isStreaming, setIsStreaming] = useState(false);

  useEffect(() => {
    (async () => {
      const hasPermission = await requestMicrophonePermission();
      if (!hasPermission) {
        alert("Microphone permission is required to use this feature.");
      }
    })();
  }, []);

  const startStreaming = () => {
    console.log(LiveAudioStream);
    LiveAudioStream.init({
      sampleRate: 16000, // default is 44100 but it's best to have a lower one for better performance.
      channels: 1, // 1 or 2, defaults to 1
      bitsPerSample: 16, // 8 or 16, defaults to 16
      audioSource: 6, // android only (see below)
      bufferSize: 4096, // default is 2048
    });
    LiveAudioStream.start();
    setIsStreaming(true);
  };

  const stopStreaming = () => {
    LiveAudioStream.stop();
    setIsStreaming(false);
  };
  return (
    <View style={styles.container}>
      <Text>Live Audio Stream Example</Text>
      <Button
        title={isStreaming ? "Stop Streaming" : "Start Streaming"}
        onPress={isStreaming ? stopStreaming : startStreaming}
      />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: "#fff",
    alignItems: "center",
    justifyContent: "center",
  },
});
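Since this is an Expo app, one more thing worth checking: react-native-live-audio-stream contains native code, so it cannot run inside Expo Go; it needs a development build (expo prebuild plus a native run, or an EAS development client) so that the Android module is actually compiled in. A hedged check for that case (the module name RNLiveAudioStream is inferred from the package path in the stack trace above; confirm it by inspecting NativeModules in a debugger):

import { NativeModules } from 'react-native';

// If the native module was never linked (for example when running in Expo Go),
// calls to init/start have nothing to reach on the native side.
const isNativeModuleLinked = () => {
  if (!NativeModules.RNLiveAudioStream) {
    console.warn(
      'react-native-live-audio-stream native module not found; ' +
        'run a development build, not Expo Go.'
    );
    return false;
  }
  return true;
};

Calling isNativeModuleLinked() at the top of startStreaming, and awaiting requestMicrophonePermission() immediately before LiveAudioStream.start(), should help narrow down whether the failure comes from a missing native module or from a permission that was never granted.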