naudio / NAudio

Audio and MIDI library for .NET
MIT License

The WasapiLoopback audio gets received but sounds like roaring and murmuring; mic audio works fine. #1039

Open JanssonTheBest opened 1 year ago

JanssonTheBest commented 1 year ago

    public static bool sendingFrames = false;
    private const int ChunkSize = 65536; // Adjust the chunk size as needed

    public static async void StartAudioStreaming()
    {
        // Create an instance of WaveInEvent to capture microphone audio
        WaveInEvent microphoneWaveInEvent = new WaveInEvent();
        microphoneWaveInEvent.WaveFormat = new WaveFormat(44100, 1); // 44.1 kHz, mono

        // Create an instance of WasapiLoopbackCapture to capture speaker audio
        WasapiLoopbackCapture speakerCapture = new WasapiLoopbackCapture();

        // Event handler for microphone audio data received
        microphoneWaveInEvent.DataAvailable += (sender, args) =>
        {
            Task.Run(async () =>
            {
                byte[] audioData = new byte[args.BytesRecorded];
                Buffer.BlockCopy(args.Buffer, 0, audioData, 0, args.BytesRecorded);

                // Convert the audio data to a base64-encoded string
                string base64Audio = Convert.ToBase64String(audioData);

                // Split the base64-encoded string into chunks
                const int ChunkSize = 10000; // Adjust the chunk size as needed
                for (int i = 0; i < base64Audio.Length; i += ChunkSize)
                {
                    int remainingBytes = Math.Min(ChunkSize, base64Audio.Length - i);
                    string chunk = base64Audio.Substring(i, remainingBytes);

                    // Add the audio indicator and delimiters for microphone audio
                    string audioChunk = "§RemoteStart§" + "§IA§" + chunk + "§RemoteEnd§";

                    // Send the audio chunk to the client
                    ClientSession.SendData(audioChunk).Wait(); // Use .Wait() to block the async method

                    // Optional delay between audio chunks (adjust as needed)
                    await Task.Delay(1);
                }
            });

        };

        // Event handler for speaker audio data received
        speakerCapture.DataAvailable += (sender, args) =>
        {
            Task.Run(async () =>
            {
                byte[] audioData = new byte[args.BytesRecorded];
                Buffer.BlockCopy(args.Buffer, 0, audioData, 0, args.BytesRecorded);

                // Convert the audio data to a base64-encoded string
                string base64Audio = Convert.ToBase64String(audioData);

                // Split the base64-encoded string into chunks
                const int ChunkSize = 10000; // Adjust the chunk size as needed
                for (int i = 0; i < base64Audio.Length; i += ChunkSize)
                {
                    int remainingBytes = Math.Min(ChunkSize, base64Audio.Length - i);
                    string chunk = base64Audio.Substring(i, remainingBytes);

                    // Add the audio indicator and delimiters for speaker audio
                    string audioChunk = "§RemoteStart§" + "§OA§" + chunk + "§RemoteEnd§";

                    // Send the audio chunk to the client
                    ClientSession.SendData(audioChunk).Wait(); // Use .Wait() to block the async method

                    // Optional delay between audio chunks (adjust as needed)
                    await Task.Delay(1);
                }
            });
        };

        // Start capturing microphone audio
        microphoneWaveInEvent.StartRecording();

        // Start capturing speaker audio
        speakerCapture.StartRecording();

        // Wait for the user to stop audio capture
        while (sendingFrames && !ClientSession.noConnection)
        {
            await Task.Delay(1);
        }

        // Stop capturing microphone audio
        microphoneWaveInEvent.StopRecording();
        microphoneWaveInEvent.Dispose();

        // Stop capturing speaker audio
        speakerCapture.StopRecording();
        speakerCapture.Dispose();
    }
markheath commented 1 year ago

You need to pay attention to the format that the audio is being captured in. You are likely capturing 16-bit PCM audio with the microphone, while WASAPI loopback will be capturing 32-bit IEEE floating-point samples (stereo, at 44.1 kHz or 48 kHz).
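
For reference, here is a minimal sketch of what handling that difference could look like on the loopback side, assuming the device really reports 32-bit IEEE float (check speakerCapture.WaveFormat first); the conversion and scaling below are illustrative, not part of the original code:

    // Inspect the actual capture format first - loopback is typically
    // 32-bit IEEE float, stereo, at the device's mixing rate (44.1 or 48 kHz)
    var speakerCapture = new WasapiLoopbackCapture();
    Console.WriteLine(speakerCapture.WaveFormat);

    speakerCapture.DataAvailable += (sender, args) =>
    {
        // Only convert if the device actually delivers 32-bit IEEE float samples
        if (speakerCapture.WaveFormat.Encoding == WaveFormatEncoding.IeeeFloat &&
            speakerCapture.WaveFormat.BitsPerSample == 32)
        {
            int sampleCount = args.BytesRecorded / 4;
            byte[] pcm16 = new byte[sampleCount * 2];
            for (int i = 0; i < sampleCount; i++)
            {
                // Read one float sample, clamp to [-1, 1], scale to the 16-bit range
                float sample = BitConverter.ToSingle(args.Buffer, i * 4);
                sample = Math.Max(-1f, Math.Min(1f, sample));
                short s = (short)(sample * short.MaxValue);
                pcm16[i * 2] = (byte)(s & 0xFF);
                pcm16[i * 2 + 1] = (byte)(s >> 8);
            }
            // pcm16 now holds 16-bit PCM, still stereo at the loopback sample rate;
            // send this (or a downmixed version) instead of the raw float buffer
        }
    };

The converted buffer is still stereo at the device's sample rate, so the receiving end would also need to downmix or resample it (or play it back with a matching WaveFormat) before it lines up with the 44.1 kHz mono microphone stream.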