I have been playing around with saving what is seen on a canvas into a video file using the new WebCodecs API.
I have been successful at creating a WebM file using the webm-muxer project, but I'm now looking into generating an MP4 instead.
This is the code I'm using with webm-muxer to record an HTMLCanvasElement into a WebM file:
import WebMMuxer from 'webm-muxer'
// to get showSaveFilePicker, and the VideoEncoder types, install and add this to tsconfig.json:
// "types": ["@types/wicg-file-system-access", "@types/dom-webcodecs"],
type CurrentRecordingCallback = () => void
let stopRecordingCallback: CurrentRecordingCallback | undefined
let renderFrame = false
export interface RecordOptions {
canvas: HTMLCanvasElement
width: number
height: number
}
export async function record(options: RecordOptions) {
let fileHandle = await window.showSaveFilePicker({
suggestedName: `myvideo.webm`,
types: [
{
description: 'Video File',
accept: { 'video/webm': ['.webm'] },
},
],
})
let fileWritableStream = await fileHandle.createWritable()
let muxer = new WebMMuxer({
target: fileWritableStream,
video: {
//codec: 'V_AV1',
codec: 'V_VP9',
width: options.width,
height: options.height,
},
})
let videoEncoder = new VideoEncoder({
output: (chunk, meta) => muxer.addVideoChunk(chunk, meta),
error: (e) => console.error(e),
})
videoEncoder.configure({
// av01 is the codec AV1
// 0 indicates Profile 0
// 16M indicates level identifier 16 (Level 6.0), can also be 8M, 4M
// 08 indicates the bit depth (8-bit)
//codec: 'av01.0.31M.10',
//codec: 'av01.0.16M.08',
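// vp09 string: VP9, profile 0 (00), level 1.0 (10), 8-bit (08)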
codec: 'vp09.00.10.08',
width: options.width,
height: options.height,
displayWidth: options.width,
displayHeight: options.height,
bitrateMode: 'variable',
bitrate: 20_000_000,
framerate: 60,
// bitrateMode: 'constant',
latencyMode: 'quality',
// hardwareAcceleration: 'prefer-hardware',
})
const startTime = performance.now()
let currentFrame = 0
const fps = 60
const frameTime = 1 / fps
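// WebCodecs timestamps and durations are expressed in microseconds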
const frameTimeMicroSeconds = Math.floor(frameTime * 1_000_000)
// derive timestamps from the frame index so the output plays back at a steady 60 fps
const render = () => {
const timestamp = currentFrame * frameTimeMicroSeconds
const duration = frameTimeMicroSeconds
const frame = new VideoFrame(options.canvas, {
displayWidth: options.width,
displayHeight: options.height,
timestamp,
duration,
})
// create a keyframe every 150 frames
const keyFrame = currentFrame % 150 == 0
videoEncoder.encode(frame, { keyFrame })
frame.close()
currentFrame++
renderFrame && requestAnimationFrame(render)
}
stopRecordingCallback = async () => {
const duration = performance.now() - startTime
renderFrame = false
stopRecordingCallback = undefined
await videoEncoder.flush()
videoEncoder.close()
muxer.finalize()
await fileWritableStream.close()
}
// start rendering
renderFrame = true
render()
return stopRecordingCallback
}
export async function stopRecording() {
if (stopRecordingCallback) {
await stopRecordingCallback()
}
}
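For reference, calling these helpers from the page looks something like this (the element IDs are just placeholders for my actual UI):

// hypothetical wiring, assuming a canvas and two buttons with these IDs exist on the page
const canvas = document.querySelector('#canvas') as HTMLCanvasElement
const startButton = document.querySelector('#start') as HTMLButtonElement
const stopButton = document.querySelector('#stop') as HTMLButtonElement

startButton.addEventListener('click', () => {
  record({ canvas, width: canvas.width, height: canvas.height })
})
stopButton.addEventListener('click', () => {
  stopRecording()
})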
I know how to reconfigure the VideoEncoder to output H.264 data instead, but I'm not sure how to set up the MP4 muxer and pass data to it, as I do above with muxer.addVideoChunk.
How could I do something like this using your library?
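For context, the H.264 reconfiguration I have in mind would replace the vp09 configure call above with something along these lines (I haven't settled on the exact avc1 profile/level string, so treat it as a placeholder):

videoEncoder.configure({
  // 'avc1.640028' = H.264 High profile, level 4.0 (placeholder, not verified for my resolution)
  codec: 'avc1.640028',
  // 'avc' format produces length-prefixed samples with out-of-band decoder config,
  // which is what an MP4 container expects (as opposed to 'annexb')
  avc: { format: 'avc' },
  width: options.width,
  height: options.height,
  bitrate: 20_000_000,
  framerate: 60,
})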