streamproc / MediaStreamRecorder

Cross-browser audio/video/screen recording. It supports Chrome, Firefox, Opera, and Microsoft Edge, and even works in Android browsers. It follows the latest MediaRecorder API standards and provides similar APIs.
https://www.webrtc-experiment.com/msr/
MIT License

Audio ogg vs wav vs webm file size #16

Open sorora opened 10 years ago

sorora commented 10 years ago

Hello,

I've been trying to use the audio recording features of this library, but I've noticed that no matter which mime type / file type I configure or save to (ogg, wav, or webm), the output ends up being the same size. Is there an actual difference in how these audio files are encoded? The result is ~1MB for just 5 or 6 seconds of audio...

Thanks

muaz-khan commented 10 years ago
  1. Chrome is currently unable to capture the microphone with stereo channels; mono is the only format Chrome captures. Chrome uses custom wrappers to convert mono audio into stereo format, i.e. dual channels. This is one of the biggest factors in the audio latency.
  2. If you explore the Chromium code, you'll see that some APIs can only be successfully called for WAV files with stereo audio. Stereo audio is only supported for WAV files.

We could convert the stereo back into mono; however, we can't encode mono audio into a WAV container. We would need to choose mp3 or another format.
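To illustrate the mono-to-stereo wrapping described above (a hedged sketch; the helper name and exact approach are my assumptions, not this library's actual code), duplicating each mono sample into left/right channels produces the interleaved dual-channel data a WAV encoder expects:

// Sketch (not MediaStreamRecorder's actual implementation): expand a mono
// sample array into interleaved stereo by copying each sample to both channels.
function monoToInterleavedStereo(monoSamples) {
    var stereo = new Float32Array(monoSamples.length * 2);
    for (var i = 0; i < monoSamples.length; i++) {
        stereo[i * 2] = monoSamples[i];     // left channel
        stereo[i * 2 + 1] = monoSamples[i]; // right channel
    }
    return stereo;
}

Note that this doubles the amount of PCM data written into the WAV container, which contributes to the large file sizes observed.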

Chrome is still missing native mp3 encoders. You would need to use an Emscripten-compiled (or similar) implementation in JavaScript.

Both MediaStreamRecorder.js and RecordRTC.js use bufferSize=4096. The buffer size is passed as the first argument to createScriptProcessor.

if (context.createJavaScriptNode) {
    // Deprecated pre-standard name for createScriptProcessor
    __stereoAudioRecorderJavacriptNode = context.createJavaScriptNode(bufferSize, 2, 2);
} else if (context.createScriptProcessor) {
    // Arguments: bufferSize, numberOfInputChannels, numberOfOutputChannels
    __stereoAudioRecorderJavacriptNode = context.createScriptProcessor(bufferSize, 2, 2);
} else {
    throw new Error('WebAudio API has no support on this browser.');
}

Possible values for Buffer-size are:

[256, 512, 1024, 2048, 4096, 8192, 16384]

From the spec: This value controls how frequently the audioprocess event is dispatched and how many sample-frames need to be processed each call. Lower values for buffer size will result in a lower (better) latency. Higher values will be necessary to avoid audio breakup and glitches.

The size of the buffer (in sample-frames) that needs to be processed each time onaudioprocess is called.
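To make that trade-off concrete, the rate at which the audioprocess event fires is simply sampleRate / bufferSize (a quick illustrative calculation, not code from this library):

// How often the audioprocess event fires for a given buffer size:
// callbacks per second = sampleRate / bufferSize.
function audioProcessRate(sampleRate, bufferSize) {
    return sampleRate / bufferSize;
}

// At 44100 Hz with the default bufferSize of 4096, the callback runs
// roughly 10.8 times per second (~93 ms of audio per call).
var rate = audioProcessRate(44100, 4096);

At the smallest allowed value of 256, the callback would fire about 172 times per second, which lowers latency but gives the main thread far less headroom per call.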

You can reduce the WAV file's size by using a lower bufferSize; however, it will result in audio quality loss.
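The ~1MB figure in the original question is consistent with uncompressed 16-bit stereo PCM in a WAV container. A quick back-of-the-envelope check (illustrative only; the 44100 Hz sample rate is an assumption):

// Uncompressed PCM data size: sampleRate * channels * bytesPerSample * seconds.
// (Ignores the 44-byte WAV header, which is negligible at this scale.)
function wavDataBytes(sampleRate, channels, bytesPerSample, seconds) {
    return sampleRate * channels * bytesPerSample * seconds;
}

// 6 seconds of 16-bit stereo at 44100 Hz:
var bytes = wavDataBytes(44100, 2, 2, 6); // 1,058,400 bytes, roughly 1 MB

If the recorder is writing uncompressed PCM regardless of the chosen mime type, that would also explain why the ogg, wav, and webm outputs all come out the same size.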