node-webrtc / node-webrtc-examples

MediaStream and RTCDataChannel examples using node-webrtc

wrtc.MediaStream empty object - Trying to create decoder for unsupported format #43

Open joezappie opened 3 years ago

joezappie commented 3 years ago

I'm trying to get a stream from a Node.js server running on a headless Raspberry Pi to an Electron app. I'm using simple-peerjs (simple-peer and PeerJS combined) to set up the connection between the Pi and the Electron app. When I start the connection, my Electron app spits out:

[11336:0304/133406.852:ERROR:internal_decoder_factory.cc(59)] Trying to create decoder for unsupported format
[11336:0304/133406.859:ERROR:internal_decoder_factory.cc(59)] Trying to create decoder for unsupported format
[11336:0304/133406.867:ERROR:webrtc_video_engine.cc(3308)] Absent receive stream; ignoring clearing encoded frame sink for ssrc 0

Because of that I never get the 'stream' event; I guess it's an invalid stream.

I've been following the video-compositing example to stream a canvas cycling through RGB colors. I copied it almost line for line, only adapting it to the simple-peerjs connect flow and changing it to loop through the rainbow across the entire canvas without text.

const { performance } = require('perf_hooks');
const { createCanvas, createImageData } = require('canvas');
const wrtc = require('wrtc');
const { RTCVideoSink, RTCVideoSource, i420ToRgba, rgbaToI420 } = wrtc.nonstandard;
const { hsv } = require('color-space');

// Create a new media stream
const source = new RTCVideoSource();
const track = source.createTrack();
const media = new wrtc.MediaStream([track]);

// On command from server, connect RPI to Electron and add the stream
socket.on('video.start', async (guid, cb) => {
  console.log("Start video");

  // Connect to the peer with guid and append the media stream
  const conn = await peer.connect(guid, {stream: media});
  console.log("Video Started!");
});

// Loop forever creating new frames
const width = 640;
const height = 480;
let hue = 0;

const canvas = createCanvas(width, height);
const context = canvas.getContext('2d');

const interval = setInterval(() => {
  hue = ++hue % 360;
  const [r, g, b] = hsv.rgb([hue, 100, 100]);

  context.fillStyle = `rgba(${r}, ${g}, ${b}, 1)`;
  context.fillRect(0, 0, width, height);

  const rgbaFrame = context.getImageData(0, 0, width, height);
  const i420Frame = {
    width,
    height,
    data: new Uint8ClampedArray(1.5 * width * height)
  };
  rgbaToI420(rgbaFrame, i420Frame);
  source.onFrame(i420Frame);
}, 1000 / 30); // run at ~30 fps (the delay argument was missing)
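As a side note on the snippet above, the `1.5 * width * height` buffer size comes from the I420 layout: a full-resolution Y plane plus U and V planes at half resolution in each dimension. A quick sanity check with a hypothetical helper (not part of wrtc):

```javascript
// I420 (planar YUV 4:2:0) byte size:
//   Y plane: width * height
//   U plane: (width / 2) * (height / 2)
//   V plane: (width / 2) * (height / 2)
// which works out to 1.5 bytes per pixel.
function i420FrameSize(width, height) {
  return width * height + 2 * (width / 2) * (height / 2);
}

console.log(i420FrameSize(640, 480)); // 460800, same as 1.5 * 640 * 480
```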

If I log the media stream on my RPI server, it comes back as an empty object. I suspect that's not right and is why it says the stream is absent and the format is unsupported:

console.log(media) // MediaStream {}
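That empty-looking log may not mean the stream is actually empty. Node's `console.log` only shows an object's own enumerable properties, and if `MediaStream`'s `id`, `active`, etc. are defined as getters on the prototype (as node-webrtc appears to do), they won't show up even though they work. A minimal sketch with a stand-in class, not the real wrtc object:

```javascript
// FakeStream mimics an object whose properties live on the prototype
// as getters/methods, like node-webrtc's MediaStream.
class FakeStream {
  get id() { return 'abc'; }
  getTracks() { return []; }
}

const s = new FakeStream();
console.log(s);              // FakeStream {} -- looks empty
console.log(Object.keys(s)); // [] -- no own enumerable properties
console.log(s.id);           // 'abc' -- but the getter still works
```

So a better check than `console.log(media)` might be logging `media.getTracks()` to see whether the track was actually attached.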

If I create a new MediaStream in my Electron app and log it, it comes back with a bunch of functions and an id:

const media = new MediaStream();
console.log(media);

// MediaStream {id: "fc452b97-f03e-4532-8129-58f1c028ab65", active: false, onaddtrack: null, onremovetrack: null, onactive: null, …}

If I add an RTCVideoSink and listen for its onframe event, I do see that it's changing and each frame is 640x480. I think the RTCVideoSource side is fine; it's just the MediaStream that isn't working correctly.

Any ideas on how to fix this?