muaz-khan / WebRTC-Experiment

WebRTC, WebRTC and WebRTC. Everything here is all about WebRTC!!
https://www.webrtc-experiment.com/
MIT License

Basic example from npm - audio doesn't work #643

Closed: methodbox closed this 4 years ago

methodbox commented 4 years ago

I'm not sure if I'm missing something, but this example from npm seems to record only video.

Can someone explain how to capture the audio as well? Based on the config it seems like it should, but all I get is a webm file with no audio.

navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true
}).then(async function(stream) {
    let recorder = RecordRTC(stream, {
        type: 'video'
    });
    recorder.startRecording();

    const sleep = m => new Promise(r => setTimeout(r, m));
    await sleep(3000);

    recorder.stopRecording(function() {
        let blob = recorder.getBlob();
        invokeSaveAsDialog(blob);
    });
});
methodbox commented 4 years ago

I should mention I'm using getDisplayMedia for this in React, like so:

_startScreenCapture() {
    navigator.mediaDevices
      .getDisplayMedia({
        video: true,
        audio: true
      })
      .then(stream => {
        let recorder = RecordRTC(stream, {
          type: "video"
        });
        recorder.startRecording();
        this.setState({ recorder: recorder, stream: stream });
      });
  }

I think the issue is that I also need to call getUserMedia and add an audio track, but I'm unclear how to achieve this.
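For what it's worth, the general idea would look something like this: request the screen with getDisplayMedia, request the microphone with getUserMedia, and merge one track from each into a fresh MediaStream. This is only a sketch of that approach (the function name getCombinedStream is mine, not from RecordRTC), assuming a browser environment:

```javascript
// Sketch: combine a screen-capture video track with a microphone
// audio track into a single MediaStream suitable for a recorder.
// NOTE: getDisplayMedia's own `audio: true` only captures system/tab
// audio where supported; the microphone needs a separate getUserMedia call.
async function getCombinedStream() {
  const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });

  const combined = new MediaStream();
  combined.addTrack(screen.getVideoTracks()[0]); // screen video
  combined.addTrack(mic.getAudioTracks()[0]);    // mic audio
  return combined;
}
```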

methodbox commented 4 years ago

For anyone looking for a solution, I figured this out by referencing this page:

https://jmperezperez.com/mediarecorder-api-screenflow/

And this issue: https://github.com/muaz-khan/RecordRTC/issues/181

Long story short, you need to create one new, empty MediaStream, obtain two source tracks (one for video and one for audio), add those tracks to the stream with addTrack(), and then feed the combined stream to the recorder.

Source Tracks > Empty Stream > Recorder

This was my solution (used in React):

_startScreenCapture() {
    const videoSource = () =>
      navigator.mediaDevices.getDisplayMedia({
        video: { mediaSource: "screen" }
      });

    const audioSource = () =>
      navigator.mediaDevices.getUserMedia({ audio: true });

    videoSource().then(vid => {
      audioSource()
        .then(audio => {
          const combinedStream = new MediaStream();
          const vidTrack = vid.getVideoTracks()[0];
          const audioTrack = audio.getAudioTracks()[0];

          combinedStream.addTrack(vidTrack);
          combinedStream.addTrack(audioTrack);
          return combinedStream;
        })
        .then(stream => {
          console.log(stream);
          let recorder = RecordRTC(stream, {
            // audio, video, canvas, gif
            type: "video",
            mimeType: "video/webm",
            recorderType: MediaStreamRecorder,
            disableLogs: true,
            timeSlice: 1000,
            bitsPerSecond: 128000,
            audioBitsPerSecond: 128000,
            videoBitsPerSecond: 128000,
            frameInterval: 90
          });
          recorder.startRecording();
          this.setState({ recorder: recorder, stream: stream });
        });
    });
  }
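To round out the flow, you eventually need to stop the recording and save the result, which the snippet above leaves in component state. A sketch of that counterpart (the name _stopScreenCapture is mine; invokeSaveAsDialog is the RecordRTC helper used in the first example):

```javascript
// Sketch: stop the recording started by _startScreenCapture, save the
// resulting blob, and release the capture tracks so the browser's
// "sharing your screen" indicator goes away.
function stopScreenCapture(recorder, stream) {
  recorder.stopRecording(function () {
    const blob = recorder.getBlob();
    invokeSaveAsDialog(blob); // RecordRTC's save helper
    stream.getTracks().forEach(track => track.stop()); // release devices
  });
}
```

In the React component above you would call it roughly as `stopScreenCapture(this.state.recorder, this.state.stream)`.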