Vinlic / WebVideoCreator

🌈 A framework for rendering web animations into videos, built on Node.js + Puppeteer + Chrome + FFmpeg and utilizing the latest browser APIs.

Real-Time Animation Rendering #11

Closed: Br1el closed this issue 12 months ago

Br1el commented 1 year ago

Problem Description

I am encountering an issue with synchronizing real-time audio-driven animation when capturing video using WebVideoCreator. The animation is based on audio frequency data and is rendered in real time in a browser environment.

The visualizer, implemented in JavaScript, creates a series of bars whose heights vary according to the frequency data obtained from the audio. This is achieved using the Web Audio API and the requestAnimationFrame method for smooth animations.

The problem I'm facing is capturing the real-time animation in sync with the audio. When recording the animation with WebVideoCreator (WVC), I run into synchronization issues where the audio and the visual elements are not perfectly aligned in the final video output. This desynchronization becomes more pronounced as the recording progresses, leading to a noticeable lag between the audio and the animation.

Attempted Solutions

  1. FPS Logging and Performance Tweaks: I've added code to log the frames per second (FPS) to monitor performance and ensure the animation runs smoothly (a simplified sketch follows after this list). Adjustments were made to optimize the animation's performance.

  2. Code Refinements: Throughout the development, I've tried several refinements to the code, aiming to optimize both the audio processing and the animation rendering.

  3. Different Capture Methods with WVC: I tried various capture setups with WebVideoCreator to resolve the synchronization issues. Running on a MacBook and on AWS EC2 instances gave similar results.
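
The FPS logging mentioned in point 1 looked roughly like this (a simplified sketch; the actual code is not reproduced here):

// Count requestAnimationFrame callbacks and log the achieved FPS once per second
let frameCount = 0;
let lastFpsLog = performance.now();

function logFps(timestamp) {
  frameCount++;
  if (timestamp - lastFpsLog >= 1000) {
    console.log(`FPS: ${frameCount}`);
    frameCount = 0;
    lastFpsLog = timestamp;
  }
  requestAnimationFrame(logFps);
}

requestAnimationFrame(logFps);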

Code Snippets

This is my audio processing setup:

// Create an AudioContext
const audioContext = new AudioContext();

// Create an AnalyserNode
const analyser = audioContext.createAnalyser();

// Configure the analyser
analyser.fftSize = 2048;
analyser.maxDecibels = 80;

// Fetch and decode the audio file, then start the animation
// (audioUrl holds the URL of the audio track; startAnimation is sketched below)
fetch(audioUrl)
  .then(response => response.arrayBuffer())
  .then(arrayBuffer => audioContext.decodeAudioData(arrayBuffer))
  .then(startAnimation)
  .catch(error => console.error('Error loading audio:', error));
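
The startAnimation function is not shown above; roughly, it wires the decoded buffer through the analyser to the output and starts the loops shown below (a simplified sketch, not the exact code):

// Simplified sketch of startAnimation: connect the decoded buffer to the
// analyser and the speakers, then start playback and the render loops.
function startAnimation(audioBuffer) {
  const source = audioContext.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(analyser);
  analyser.connect(audioContext.destination);
  source.start();

  animate(); // drawing loop (see below)
  update();  // frequency-data loop (see below)
}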

This is my animation loop:

function animate() {
  drawBars();
  animationFrameId = requestAnimationFrame(animate);
}

function update() {
  const dataArray = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(dataArray);
  // Processing of dataArray into normalizedData (details elided)...
  updateFrequencyData(normalizedData);
  requestAnimationFrame(update);
}
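
drawBars and updateFrequencyData are simplified away above; conceptually they do something like the following (hypothetical canvas setup, for illustration only):

// Illustrative sketch: scale canvas bars by the latest normalized frequency data
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
let latestData = [];

function updateFrequencyData(normalizedData) {
  latestData = normalizedData; // values in the 0..1 range
}

function drawBars() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  const barWidth = canvas.width / latestData.length;
  latestData.forEach((value, i) => {
    const barHeight = value * canvas.height;
    ctx.fillRect(i * barWidth, canvas.height - barHeight, barWidth - 1, barHeight);
  });
}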

The following is my WebVideoCreator capturing setup:

// Import WebVideoCreator and necessary components
import WebVideoCreator, { VIDEO_ENCODER, core, AUDIO_ENCODER } from 'web-video-creator';

const { ResourcePool } = core;

// Create a new instance of WebVideoCreator
const wvc = new WebVideoCreator();

// Global configuration
wvc.config({
    mp4Encoder: VIDEO_ENCODER.CPU.H264, // Choose the encoder based on your hardware
    browserUseGPU: true,                // Enable GPU acceleration if available
    frameFormat: "jpeg"                // Use jpeg for frame capture
});

// URL of your animation page with query parameters
const animationURL = '';

// Create a new ResourcePool with browser options
const resourcePool = new ResourcePool({
    numBrowserMin: 1,
    numBrowserMax: 5,
    browserOptions: {
        args: ["--autoplay-policy=no-user-gesture-required"], // Add Chrome flags here
        useGPU: true,
    },
});

// Create a single screen video with custom resource pool
const video = wvc.createSingleVideo({
    url: animationURL,
    width: 1920,    // Video width
    height: 1080,   // Video height
    fps: 60,        // Frame rate
    duration: 25000, // Duration in milliseconds (adjust based on audio length)
    outputPath: './recordings/25secwvc60fps2.mp4', // Output file path
    showProgress: true, // Show progress in the CLI
    resourcePool: resourcePool, // Use the custom resource pool
});

// Start rendering the video
video.start();

Recordings

"Normal Screen Recording" gives the expected result when capturing the animation and represents the animation when looking at it in the browser window.

Normal Screen Recording:

https://github.com/Vinlic/WebVideoCreator/assets/48747427/e6e56a33-48d2-4af2-8e9b-b7d00eda6675

At 60 fps, the animation in the recorded video runs too fast when combined with the audio:

WebVideoCreator 60 fps:

https://github.com/Vinlic/WebVideoCreator/assets/48747427/e4bdfff4-2ced-4d26-9b5f-4cdfff906606

At 30 fps, the animation in the recorded video runs too slow when combined with the audio:

WebVideoCreator 30 fps:

https://github.com/Vinlic/WebVideoCreator/assets/48747427/d33811d6-d3de-4fec-99be-5919db2637a7

Vinlic commented 1 year ago

@Br1el Hey, I noticed that you use requestAnimationFrame but don't handle the currentTime provided to its callback. The content WVC records should be an animation on a timeline, while an audio-driven visualizer is a real-time animation process. If you want to capture it accurately, please first pre-compute the audio's amplitude (PCM) data and then play it back using anime.js or another animation library that supports timelines. :D
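
In practice, the idea is roughly the following (a minimal sketch; precomputeFrequencyFrames and drawBars are hypothetical placeholders for your own analysis and drawing code, and the per-frame data could be produced ahead of time with an OfflineAudioContext or loaded as JSON):

const FPS = 60; // must match the fps configured in WVC

precomputeFrequencyFrames(audioUrl, FPS).then(frames => {
  function render(currentTime) {
    // currentTime is supplied by the requestAnimationFrame callback and follows
    // the virtual timeline, so the same frame index is chosen regardless of how
    // fast frames are actually being captured.
    const frameIndex = Math.min(frames.length - 1, Math.floor((currentTime / 1000) * FPS));
    drawBars(frames[frameIndex]); // draw from pre-computed data, not a live AnalyserNode
    requestAnimationFrame(render);
  }
  requestAnimationFrame(render);
});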

Br1el commented 1 year ago

Thanks for the response! Will try to adjust my code accordingly and let you know if it worked out for me. :)

Vinlic commented 12 months ago

I will close this issue and you can reopen it at any time if you have other questions. 😸