DeltaCircuit / react-media-recorder

react-media-recorder is a React component with a render prop that can be used to record audio/video streams using the MediaRecorder API.
https://npmjs.com/react-media-recorder
MIT License

audio file format #31

Open satyajitghana opened 3 years ago

satyajitghana commented 3 years ago

Hi,

I am trying to record audio in the browser:

    import { useReactMediaRecorder } from "react-media-recorder";

    const {
        status,
        startRecording,
        stopRecording,
        mediaBlobUrl,
    } = useReactMediaRecorder({
        video: false,
        audio: true,
        blobPropertyBag: {
            type: "audio/wav"
        }
    });

and fetching the file from it:

    // uuidv4 is assumed to come from the "uuid" package (import { v4 as uuidv4 } from "uuid")
    const audioBlob = await fetch(mediaBlobUrl).then((r) => r.blob());

    console.log(audioBlob);

    const audiofile = new File([audioBlob], `${uuidv4()}.wav`, { type: "audio/wav" });

Now I saved this file to disk and checked it with the file utility,

and it shows up as a WebM file:

    $ file b61508fa-9c4e-47fc-bb54-5def7abf9bfc.wav
    b61508fa-9c4e-47fc-bb54-5def7abf9bfc.wav: WebM

What I need is a WAV file, but it is giving me a WebM file.

gdrbyKo1 commented 3 years ago

Hi. As far as I can tell, the MediaRecorder API currently doesn't support the audio/wav MIME type. This is unrelated to react-media-recorder; please see https://github.com/w3c/mediacapture-record/issues/198 and the related Firefox and Chromium issues for details.
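You can check this directly in the browser console. As a rough illustration (not from the original comment), MediaRecorder.isTypeSupported reports what the current browser can actually record:

    // Typical results in current Chrome/Firefox; exact support varies by browser.
    MediaRecorder.isTypeSupported("audio/wav");              // usually false
    MediaRecorder.isTypeSupported("audio/webm;codecs=opus"); // usually true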

satyajitghana commented 3 years ago

Oh okay, thanks. So probably the mimeType has to be changed in your library? It's kind of misleading.

Also, I was wondering: can the bytes in the recorded file be changed to audio/wav, i.e. can we convert audio/webm to audio/wav on the client side?

DeltaCircuit commented 3 years ago

blobPropertyBag is used only for generating the blobs (after the recording has been completed) inside the media recorder. If you want to use a specific codec, you need to pass it through mediaRecorderOptions. It'll check for browser support first and will emit a console.error if there's no support. I'd say don't use blobPropertyBag if it's not a big deal; it's just a placeholder for extending the Blob object. Otherwise we'd need to keep them in sync.
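
A minimal sketch of the mediaRecorderOptions route (the Recorder component and the preferred-type list below are made up for illustration, not from this comment), letting the browser pick a MIME type it actually supports:

    import React from "react";
    import { useReactMediaRecorder } from "react-media-recorder";

    // Pick the first MIME type the browser can actually record;
    // audio/wav is usually rejected, so this typically falls back to WebM/Opus.
    const preferred = ["audio/wav", "audio/webm;codecs=opus", "audio/webm"];
    const supportedType = preferred.find((t) => MediaRecorder.isTypeSupported(t));

    function Recorder() {
        const { status, startRecording, stopRecording, mediaBlobUrl } =
            useReactMediaRecorder({
                audio: true,
                video: false,
                // passed through to the underlying MediaRecorder constructor
                mediaRecorderOptions: supportedType ? { mimeType: supportedType } : undefined,
            });

        return (
            <div>
                <p>{status}</p>
                <button onClick={startRecording}>Start</button>
                <button onClick={stopRecording}>Stop</button>
                {mediaBlobUrl && <audio src={mediaBlobUrl} controls />}
            </div>
        );
    }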

That being said, currently this library defaults to video/mp4 / audio/wav while generating the blob, which is kinda incorrect. That needs to be fixed.

For a quick conversion, I'd suggest ffmpeg, which has webm --> wav support, I believe.

satyajitghana commented 3 years ago

Yes, thanks, ffmpeg worked! I added it on the server side though, and just realised I could have done it on the client side instead.

Anyway, I was able to convert webm to wav.

liadbelad commented 3 years ago

Hey, I got the same issue and need to convert to a WAV file on the client side so I can send a WAV file to a server. How can I do this using ffmpeg on the client side, please?

DeltaCircuit commented 3 years ago

ffmpeg's usage is pretty straightforward. Download it from here and use it like this:

    ffmpeg -i <input_webm> -c:a pcm_f32le <output>.wav
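
For example (the file names here are placeholders), the same conversion downmixed to mono 16-bit PCM at 16 kHz, which many speech APIs expect:

    # input/output names are placeholders
    ffmpeg -i recording.webm -ac 1 -ar 16000 -c:a pcm_s16le recording.wav
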
satyajitghana commented 3 years ago

@liadbelad I'm not sure if this would work: https://www.npmjs.com/package/ffmpeg

gdrbyKo1 commented 3 years ago

@liadbelad for client-side use, there's https://github.com/Kagami/ffmpeg.js/ and https://github.com/ffmpegwasm/ffmpeg.wasm. But are you sure you really want to do that? Sending uncompressed audio data over a network will require significantly more bandwidth. It might be a better choice to do the decoding on the receiving side.
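
A rough sketch of the ffmpeg.wasm route (this assumes the older createFFmpeg API of @ffmpeg/ffmpeg, roughly v0.11; the v0.12 API differs, and the in-memory file names are arbitrary):

    import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg";

    const ffmpeg = createFFmpeg({ log: true });

    // Convert the blob behind react-media-recorder's mediaBlobUrl into a WAV blob.
    async function webmToWav(mediaBlobUrl) {
        if (!ffmpeg.isLoaded()) await ffmpeg.load();
        ffmpeg.FS("writeFile", "input.webm", await fetchFile(mediaBlobUrl));
        await ffmpeg.run("-i", "input.webm", "-c:a", "pcm_f32le", "output.wav");
        const data = ffmpeg.FS("readFile", "output.wav");
        return new Blob([data.buffer], { type: "audio/wav" });
    }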

no-1ne commented 3 years ago

ffmpeg is overkill. Try https://github.com/ai/audio-recorder-polyfill: it has a WAV encoder and an MP3 encoder, and it works on iOS as well.
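
If I'm reading that polyfill's README correctly, the idea is to replace the global MediaRecorder before react-media-recorder uses it; treat this as an unverified sketch:

    import AudioRecorder from "audio-recorder-polyfill";

    // Swap in the polyfill so recordings are encoded as audio/wav.
    // This must run before any component calls useReactMediaRecorder.
    window.MediaRecorder = AudioRecorder;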

guest271314 commented 3 years ago

> Hi. As far as I can tell, the MediaRecorder API currently doesn't support the audio/wav

Note, Chrome does support x-matroska;codecs=pcm, where the PCM can be extracted; see https://github.com/WebAudio/web-audio-api-v2/issues/63.

Using the Web Audio API, a MediaStream can be set at a MediaStreamAudioSourceNode (or at an HTMLMediaElement) and then connected to an AudioWorklet, where the Float32Arrays from the inputs can be stored in an Array, ArrayBuffer, SharedArrayBuffer, or other container; WAV headers can then be written, e.g., https://github.com/ai/audio-recorder-polyfill/issues/7#issuecomment-744066897.
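
A condensed sketch of that pipeline (assuming a hypothetical worklet module recorder-processor.js that registers an AudioWorkletProcessor named "recorder-processor" and posts each input Float32Array block back over its MessagePort):

    async function captureFloat32(stream) {
        const ctx = new AudioContext();
        await ctx.audioWorklet.addModule("recorder-processor.js");
        const source = ctx.createMediaStreamSource(stream);
        const recorder = new AudioWorkletNode(ctx, "recorder-processor");
        const chunks = []; // raw Float32Array blocks; WAV headers are written afterwards
        recorder.port.onmessage = (event) => chunks.push(event.data);
        source.connect(recorder);
        return chunks;
    }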

vinikatyal commented 2 years ago

Does this still not work?

guest271314 commented 2 years ago

This is what I used the last time I needed to convert floats to WAV: https://github.com/guest271314/webcodecs/blob/main/WavAudioEncoder.js. Usage:

    const wav = await new WavAudioEncoder({
        sampleRate: 48000,
        numberOfChannels: 1,
        buffers: [new Float32Array(48000)],
    }).encode();

olegpanchenko commented 1 year ago

    useReactMediaRecorder({ mediaRecorderOptions: { mimeType: 'audio/wav' } });

ramonrovirosa commented 1 year ago

In case it helps anyone in the future, I posted my solution using ffmpeg here https://github.com/0x006F/react-media-recorder/issues/101#issuecomment-1499626166

renny-ren commented 4 months ago

> blobPropertyBag is used only for generating the blobs (after the recording has been completed) inside the media recorder. If you want to use a specific codec, you need to pass it through mediaRecorderOptions. It'll check for browser support first and will emit a console.error if there's no support. I'd say don't use blobPropertyBag if it's not a big deal; it's just a placeholder for extending the Blob object. Otherwise we'd need to keep them in sync.
>
> That being said, currently this library defaults to video/mp4 / audio/wav while generating the blob, which is kinda incorrect. That needs to be fixed.
>
> For a quick conversion, I'd suggest ffmpeg, which has webm --> wav support, I believe.

This should be written in the documentation 😀 I thought blobPropertyBag could do the conversion.