Closed: lumierephoto closed this issue 9 years ago
The article you linked explains how to stream video from a webcam behind a private ISP home connection to many users on the public internet. In short, a local PC with the webcam encodes /dev/video0 into MPEG, sends it via HTTP to a public webserver, which in turn distributes it to all connected browsers via WebSockets.
So, yes, you can use a VPS for the distribution part: install Node.js and use the stream-server.js from this repo.
I see how the article is a bit confusing in this regard. I'll change it to explain the basic setup first.
I have included a modified "stream-server.js" in a node server serving a webpage derived from "stream-example.html". It displays the webcam stream nicely inside the local network, but the video is not shown on the internet (whereas the rest of the webpage is). Any explanation for this?
Example? The stream-server.js needs to run on a public server with its own IP address.
The server I use is just "stream-server.js", slightly modified with the "servi" node module added:
if( process.argv.length < 3 ) {
	console.log(
		'Usage: \n' +
		'node stream-server.js <secret> [<stream-port> <websocket-port>]'
	);
	process.exit();
}

var STREAM_SECRET = process.argv[2],
	STREAM_PORT = process.argv[3] || 8082,
	WEBSOCKET_PORT = process.argv[4] || 8084,
	STREAM_MAGIC_BYTES = 'jsmp'; // Must be 4 bytes

var width = 320,
	height = 240;

// WebSocket server: pushes the stream out to all connected browsers
var socketServer = new (require('ws').Server)({port: WEBSOCKET_PORT});
socketServer.on('connection', function(socket) {
	// Send magic bytes and video size to the newly connected socket
	// struct { char magic[4]; unsigned short width, height; }
	var streamHeader = new Buffer(8);
	streamHeader.write(STREAM_MAGIC_BYTES);
	streamHeader.writeUInt16BE(width, 4);
	streamHeader.writeUInt16BE(height, 6);
	socket.send(streamHeader, {binary: true});

	console.log('New WebSocket Connection (' + socketServer.clients.length + ' total)');

	socket.on('close', function(code, message) {
		console.log('Disconnected WebSocket (' + socketServer.clients.length + ' total)');
	});
});

socketServer.broadcast = function(data, opts) {
	for( var i in this.clients ) {
		if( this.clients[i].readyState == 1 ) {
			this.clients[i].send(data, opts);
		}
		else {
			console.log('Error: Client (' + i + ') not connected.');
		}
	}
};

// HTTP server to accept the incoming MPEG stream from ffmpeg
var streamServer = require('http').createServer(function(request, response) {
	var params = request.url.substr(1).split('/');
	if( params[0] == STREAM_SECRET ) {
		width = (params[1] || 320) | 0;
		height = (params[2] || 240) | 0;
		console.log(
			'Stream Connected: ' + request.socket.remoteAddress +
			':' + request.socket.remotePort + ' size: ' + width + 'x' + height
		);
		request.on('data', function(data) {
			socketServer.broadcast(data, {binary: true});
		});
	}
	else {
		console.log(
			'Failed Stream Connection: ' + request.socket.remoteAddress +
			':' + request.socket.remotePort + ' - wrong secret.'
		);
		response.end();
	}
}).listen(STREAM_PORT);

console.log('Listening for MPEG Stream on http://127.0.0.1:' + STREAM_PORT + '/<secret>/<width>/<height>');
console.log('Awaiting WebSocket connections on ws://127.0.0.1:' + WEBSOCKET_PORT + '/');

// Added: serve the client webpage with the "servi" module
var servi = require('servi'), // include the servi library
	app = new servi(false);   // servi instance

// configure the server's behavior:
app.port(8080);               // port number to run the server on
app.serveFiles("public");     // serve all static HTML files from /public
app.start();
console.log("Listening for new clients on port 8080");
Inside "public" there is an HTML file similar to "stream-example.html", to which I added an <h1> element for debugging:
<!DOCTYPE html>
<html>
<head>
	<meta name="viewport" content="width=320, initial-scale=1"/>
	<title>jsmpeg streaming webcam</title>
	<style type="text/css">
		body {
			background: white;
			text-align: center;
			margin-top: 10%;
		}
		#videoCanvas {
			/* Always stretch the canvas to 640x480, regardless of its
			internal size. */
			width: 640px;
			height: 480px;
		}
	</style>
</head>
<body>
	<h1>
		The Canvas size specified
	</h1>
	<!-- The Canvas size specified here is the "initial" internal resolution. jsmpeg will
		change this internal resolution to whatever the source provides. The size the
		canvas is displayed on the website is dictated by the CSS style.
	-->
	<canvas id="videoCanvas" width="640" height="480">
		<p>
			Please use a browser that supports the Canvas Element, like
			<a href="http://www.google.com/chrome">Chrome</a>,
			<a href="http://www.mozilla.com/firefox/">Firefox</a>,
			<a href="http://www.apple.com/safari/">Safari</a> or Internet Explorer 10
		</p>
	</canvas>

	<script type="text/javascript" src="jsmpg.js"></script>
	<script type="text/javascript">
		// Show loading notice
		var canvas = document.getElementById('videoCanvas');
		var ctx = canvas.getContext('2d');
		ctx.fillStyle = '#444';
		ctx.fillText('Loading...', canvas.width/2-30, canvas.height/3);

		// Setup the WebSocket connection and start the player
		var client = new WebSocket('ws://192.168.1.15:8084/');
		var player = new jsmpeg(client, {canvas: canvas});
	</script>
</body>
</html>
var client = new WebSocket( 'ws://192.168.1.15:8084/' );
This is a local address. You have to specify the public address of your node server here.
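One way to avoid hard-coding either address is to derive the WebSocket URL from the hostname the page was served from. This is a sketch, assuming the WebSocket port (8084) runs on the same host that serves the page; `buildStreamUrl` is a hypothetical helper name:

```javascript
// Sketch: build the WebSocket URL from the page's own hostname instead of
// hard-coding it, so the same HTML works on the LAN and on a public server.
function buildStreamUrl(hostname, port) {
	return 'ws://' + hostname + ':' + port + '/';
}

// In the browser you would use the live location object:
// var client = new WebSocket(buildStreamUrl(location.hostname, 8084));
console.log(buildStreamUrl('192.168.1.15', 8084)); // on the local network
console.log(buildStreamUrl('example.com', 8084));  // on a public server
```

This only helps when the page and the WebSocket server share a host; if they are on different machines you still have to write the server's address explicitly.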
I am still struggling with jsmpeg on WLAN. In the example you give on your blog (http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets) you use "example.com" as the address for ffmpeg, and it is also used in stream-example.html. I do not understand very well what this "example.com" should be. Can it be just my public address once stream-example.html is served by a node server running on localhost? Can you kindly clarify for me what "example.com" is? Thanks a lot.
If you configure your router to forward all requests from the internet on port 8084 to the local machine running the nodejs script, then yes, you can use your public ip address.
example.com is your own public server. The typical setup is this:
PC/MAC with a webcam --> public server <-- client browser
Your machine with the webcam runs ffmpeg, encodes the webcam video as MPEG-1 and sends it to the public server via HTTP. The public server runs a nodejs script that accepts this HTTP stream and acts as a WebSocket server: it accepts incoming WebSocket connections from browsers and sends the stream out over these connections.
The intermediate step, with the public server distributing the video over WebSocket, was chosen for two reasons:
1) NAT-Traversal is a pain in the ass. It's difficult to have a server running on your local network that is accessible from the internet. (See Port Forwarding)
2) Personal internet connections usually have very limited upstream bandwidth (in Europe it's mostly in the 2-5 Mbit/s range for a connection that has 50-100 Mbit/s downstream). A public server usually has more upstream bandwidth and can distribute the video stream to a large number of clients if needed.
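The bandwidth point can be made concrete with a quick calculation. The 800 kbit/s figure below is an assumed stream bitrate for a small MPEG-1 webcam encode, not a measurement:

```javascript
// Rough arithmetic: serving N viewers directly from a home connection
// multiplies the required upstream by N; via a relay the home connection
// uploads the stream only once.
var streamKbits = 800;    // assumed MPEG-1 stream bitrate
var upstreamKbits = 2000; // a typical 2 Mbit/s home upstream

function maxDirectViewers(upstream, bitrate) {
	return Math.floor(upstream / bitrate);
}

console.log(maxDirectViewers(upstreamKbits, streamKbits)); // -> 2
// With the relay, the home link carries one 800 kbit/s stream regardless of
// viewer count; the public server's upstream sets the real viewer limit.
```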
Edit: if you just want to use jsmpeg within your wifi network, not streaming it over the internet, you can of course run ffmpeg and the nodejs script on the same machine. jsmpeg would then connect to the local address of the machine running the nodejs script.
I opened an instance on AWS EC2 on which I run stream-server.js, then I run ffmpeg on my local machine with the AWS instance address for port 8082, then I open stream-example.html in a browser with the AWS instance address for port 8084. I get some images which change at a very slow rate, like one every 30 seconds. It appears as if I had very low upstream bandwidth between my local machine and the AWS instance. Any explanation for this behaviour?
Hello Dominic,
I'm currently using your JSMpeg javascript library to live stream from a Raspberry Pi, using one of your examples with ffmpeg encoding. It works perfectly fine, but I do not understand how it is possible for the images captured on the Raspberry Pi to be encoded with ffmpeg's x264 and decoded with JSMpeg as MPEG-1. Could you please explain this in depth?
Thank you so much!
Regards, Alex
Can JSMpeg work for streaming my webcam if I spin up a remote VPS (e.g. Digital Ocean)?
I'm a little confused about its application, and I assume that the example you give is for doing it on a local machine.
I guess my question is: How would I send the webcam connection /dev/video0 or /dev/video1 to the remote server? Is there anything different I would need to do from your tutorial: http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets ?