bbc / VideoContext

An experimental HTML5 & WebGL video composition and rendering API.
http://bbc.github.io/VideoContext/
Apache License 2.0
1.33k stars 157 forks

Suggestion: Exporting video via MediaStream #156

Open MysteryPancake opened 5 years ago

MysteryPancake commented 5 years ago

I'm not sure how well this would work, but the MediaStream interface allows the capturing of canvas content: https://developers.google.com/web/updates/2016/10/capture-stream

Small example: https://webrtc.github.io/samples/src/content/capture/canvas-record/

The audio and video tracks could be recorded using this method: https://stackoverflow.com/a/39302994

For timing the render so every frame is captured, requestFrame could be used: https://developer.mozilla.org/en-US/docs/Web/API/CanvasCaptureMediaStreamTrack/requestFrame

This could potentially fix https://github.com/bbc/VideoContext/issues/76 and https://github.com/bbc/VideoContext/issues/124
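For reference, a minimal sketch of the frame-accurate approach (the function name and the `renderNextFrame` callback are illustrative, not part of VideoContext): passing `0` to `captureStream()` creates a stream that only emits a frame when `requestFrame()` is called, so capture can be driven explicitly by the render loop.

```javascript
// Sketch only: drive a MediaRecorder from an explicit render loop so every
// rendered frame is captured, via CanvasCaptureMediaStreamTrack.requestFrame().
// `renderNextFrame` is a hypothetical callback that draws one frame and
// returns false once there is nothing left to render.
function recordFrameByFrame(canvas, renderNextFrame, onDone) {
  // A frameRate of 0 means frames are only emitted on explicit requestFrame() calls
  const stream = canvas.captureStream(0);
  const [track] = stream.getVideoTracks();

  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  const chunks = [];
  recorder.ondataavailable = e => e.data.size > 0 && chunks.push(e.data);
  recorder.onstop = () => onDone(new Blob(chunks, { type: "video/webm" }));
  recorder.start();

  (function step() {
    if (renderNextFrame()) {
      track.requestFrame(); // push exactly one frame into the stream
      requestAnimationFrame(step);
    } else {
      recorder.stop();
    }
  })();
}
```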

PTaylour commented 5 years ago

This looks interesting, definitely worth a shot.

We're concentrating on getting the currently open PRs merged at the moment, so I'm unlikely to be able to dedicate time to this for a while yet.

However, if anyone is up for trying it out in the interim I'm very interested in the results!

jo-hnny commented 5 years ago

I did the same thing and it works well. The next step is recording the audio. One issue: exporting the full video requires playing it through to the end; is there a way to speed this up?

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <meta http-equiv="X-UA-Compatible" content="ie=edge" />
    <title>Document</title>

    <style>
      * {
        padding: 0;
        margin: 0;
      }

      html {
        width: 100%;
        height: 100%;
      }

      body {
        width: 100%;
        height: 100%;
      }

      canvas {
        display: block;
        width: 1280px;
        height: 720px;
        margin: 0 auto;
      }

      .btns {
        width: 100%;
        height: 50px;
        display: flex;
        justify-content: space-around;
      }

      .btn {
        background-color: #000;
        color: #fff;
        cursor: pointer;

        width: 100px;
        height: 100%;

        display: flex;
        justify-content: center;
        align-items: center;
      }
    </style>
  </head>
  <body>
    <canvas width="1280" height="720"></canvas>

    <div class="btns">
      <div class="btn play">play</div>

      <div class="btn stop">stop</div>

      <div class="btn download">download</div>
    </div>

    <script src="http://bbc.github.io/VideoContext/dist/videocontext.js"></script>

    <script>
      class Record {
        constructor(canvas, { videoType = "webm" } = {}) {
          this.canvas = canvas
          this.videoType = videoType

          this.init()
        }

        init() {
          const stream = this.canvas.captureStream()

          this.mediaRecorder = new MediaRecorder(stream, {
            mimeType: "video/webm"
          })

          this.recordedBlobs = []

          this.mediaRecorder.ondataavailable = this.handleDataAvailable
        }

        start() {
          this.mediaRecorder.start()
        }

        stop() {
          this.mediaRecorder.stop()
        }

        handleDataAvailable = event => {
          if (event.data && event.data.size > 0) {
            this.recordedBlobs.push(event.data)
          }
        }

        download(name) {
          const blob = new Blob(this.recordedBlobs, { type: "video/webm" })
          const url = window.URL.createObjectURL(blob)
          const a = document.createElement("a")
          a.href = url
          a.download = `${name}.${this.videoType}`
          a.click()
          window.URL.revokeObjectURL(url)
        }
      }

      const bindPlay = videoContext => {
        const playBtn = document.querySelector(".play")

        const stopBtn = document.querySelector(".stop")

        const downloadBtn = document.querySelector(".download")

        const record = new Record(videoContext._canvas)

        console.log(record)

        playBtn.addEventListener("click", () => {
          console.log("start")
          videoContext.play()

          record.start()
        })

        stopBtn.addEventListener("click", () => {
          console.log("stop")
          videoContext.pause()

          record.stop()
        })

        downloadBtn.addEventListener("click", () => {
          console.log("download")
          record.download("test")
        })
      }

      const createEffectNodes = (videoContext, n) => {
        const {
          MONOCHROME,
          HORIZONTAL_BLUR,
          COLORTHRESHOLD,
          AAF_VIDEO_FLIP
        } = VideoContext.DEFINITIONS

        const effects = [
          MONOCHROME,
          HORIZONTAL_BLUR,
          COLORTHRESHOLD,
          AAF_VIDEO_FLIP
        ]

        return [...new Array(n)].map(() =>
          videoContext.effect(
            // Math.floor avoids an out-of-range index when Math.random() is near 1
            effects[Math.floor(Math.random() * effects.length)]
          )
        )
      }

      const start = () => {
        const canvas = document.querySelector("canvas")
        const videoContext = new VideoContext(canvas)

        const videoNode = videoContext.video(
          "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"
        )
        videoNode.startAt(0)

        const effectNodes = createEffectNodes(videoContext, 2)

        effectNodes
          .concat([videoContext.destination])
          .reduce((preNode, currentNode) => {
            preNode.connect(currentNode)

            return currentNode
          }, videoNode)

        bindPlay(videoContext)
      }

      window.onload = start
    </script>
  </body>
</html>
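On the audio question: following the Stack Overflow answer linked above, one way (a sketch, assuming the audio is routed through a Web Audio `AudioContext`) is to merge an audio track into the captured stream before handing it to `MediaRecorder`. Here `audioSourceNode` is a hypothetical `AudioNode`, e.g. a `MediaElementAudioSourceNode` created from the underlying `<video>` element.

```javascript
// Sketch: combine the canvas video track with audio routed through Web Audio.
function captureWithAudio(canvas, audioCtx, audioSourceNode) {
  const videoStream = canvas.captureStream();

  // Route the audio into a MediaStreamDestination to obtain an audio track
  const audioDest = audioCtx.createMediaStreamDestination();
  audioSourceNode.connect(audioDest);

  // Merge both tracks into a single stream for MediaRecorder
  return new MediaStream([
    ...videoStream.getVideoTracks(),
    ...audioDest.stream.getAudioTracks()
  ]);
}

// Usage (in a browser):
// const recorder = new MediaRecorder(
//   captureWithAudio(canvas, audioCtx, sourceNode),
//   { mimeType: "video/webm" }
// );
```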
MysteryPancake commented 5 years ago

I noticed sdobz and some others used a similar method in https://github.com/bbc/VideoContext/issues/124 as well. I would like to test out requestFrame, as I feel this could improve the timing. I may experiment with it later

PTaylour commented 5 years ago

@MysteryPancake did you get a chance to test out requestFrame?

MysteryPancake commented 5 years ago

Not yet. I'm not sure about determining the framerate. It would be good if seekToNextFrame were widely supported.

PTaylour commented 5 years ago

I'm not sure about determining the framerate

Yeah, VideoContext has no knowledge of the framerate; it just updates on each animationFrame (or tick in the web worker).

Depending on your inputs you could have elements running at different frame rates anyway.

So maybe it's a decision the user has to make?
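If the frame rate is left as a user decision, the simplest form is to pass it straight to `captureStream()`, which caps the capture at that rate (a sketch; `fps` is the user-supplied value):

```javascript
// Sketch: let the caller choose the capture frame rate. Frames are captured
// at most `fps` times per second, regardless of how often VideoContext
// actually redraws the canvas.
function createRecorder(canvas, fps = 30) {
  const stream = canvas.captureStream(fps);
  return new MediaRecorder(stream, { mimeType: "video/webm" });
}
```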

thenikso commented 4 years ago

Any update on this one? I've tried it myself, but adding the audio stream seems quite tricky.

guest271314 commented 4 years ago

What is the requirement?

thenikso commented 4 years ago

I believe the ultimate goal here would be to have an API like ctx.saveToFile('webm') that saves the video and audio of the VideoContext to a file in the specified format.
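A hedged sketch of what such a wrapper could look like. `saveToFile` is hypothetical, not part of the VideoContext API; it assumes the context exposes its canvas as `ctx._canvas` and that `registerCallback("ended", ...)` fires when playback finishes, as in current VideoContext builds. Audio is not handled here.

```javascript
// Hypothetical sketch: record the canvas while the VideoContext plays to the
// end, then resolve with a video Blob that can be saved to a file.
function saveToFile(ctx, format = "webm") {
  return new Promise(resolve => {
    const stream = ctx._canvas.captureStream();
    const recorder = new MediaRecorder(stream, { mimeType: `video/${format}` });
    const chunks = [];
    recorder.ondataavailable = e => e.data.size > 0 && chunks.push(e.data);
    recorder.onstop = () => resolve(new Blob(chunks, { type: `video/${format}` }));

    // Stop recording when the context reaches the end of its timeline
    ctx.registerCallback("ended", () => recorder.stop());
    recorder.start();
    ctx.play();
  });
}
```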