eduardolundgren / tracking.js

A modern approach for Computer Vision on the web
http://trackingjs.com

web app deployment #192

Open lucasroth opened 7 years ago

lucasroth commented 7 years ago

Hi, I have some good experience using AR SDKs, but not much with web development. I am stuck on how to deploy an existing example to a web server. I tested uploading the whole tracking.js-master folder to an https server, and the examples work fine over the internet, but I can't do the same with a new .html file created just by copying and pasting an existing .html example. Could you please detail, or give a hint on, how to wrap the web app correctly?

AnthyG commented 7 years ago

You just need to include the tracking.js file (in the directory you downloaded, it should be located in /build). So first of all, you need to add a video element and a canvas element to your HTML:

<div id="C">
    <video id="video" width="800" height="600" preload autoplay loop muted></video>
    <canvas id="canvas" width="800" height="600"></canvas>
</div>
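
To load the library itself, a plain script tag pointing at the build output is enough. Just a sketch: the exact filename (e.g. tracking-min.js) is an assumption and may differ in your copy:

<script src="build/tracking-min.js"></script>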

Here's some CSS to get a mirrored video:

#C {
    width: 800px;
    height: 600px;
    position: relative;
    margin: 0 auto;
}

/* Hide your video-element with this */
#video{
    /*visibility: hidden;*/
}

#canvas {
    position: absolute;
    top: 0;
}

/* 'Mirror' video and canvas-elements */
#video,
#canvas {
    -moz-transform: scale(-1, 1);
    -o-transform: scale(-1, 1);
    -webkit-transform: scale(-1, 1);
    transform: scale(-1, 1);
    filter: FlipH;
}

Then you have to create a context for that in JavaScript as follows:

var canvas = document.getElementById("canvas");
var ctx = canvas.getContext('2d');

This should be called once the DOM is loaded, so just put it into a script tag right before </body>.
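
For example, the overall page could be laid out like this (just a sketch; the script filename is an assumption):

<body>
    <div id="C">
        <video id="video" width="800" height="600" preload autoplay loop muted></video>
        <canvas id="canvas" width="800" height="600"></canvas>
    </div>

    <script src="build/tracking-min.js"></script>
    <script>
        // The elements above already exist in the DOM at this point
        var canvas = document.getElementById("canvas");
        var ctx = canvas.getContext('2d');
        // ... tracker setup (see below) goes here ...
    </script>
</body>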

Next you have to create a tracker. For this example, I'll just use the ColorTracker already provided with tracking.js:

// Only track the color magenta
var colors = new tracking.ColorTracker(['magenta']);
colors.on('track', function(event) {
    // Here is some sample code to draw dots onto the canvas
    event.data.forEach(function(rect) {
        var x = rect.x + rect.width / 2, // This will get the horizontal center of the rect
            y = rect.y + rect.height / 2, // .. and the vertical
            w = rect.width,
            h = rect.height,
            c = rect.color;

        ctx.beginPath();
        ctx.arc(x, y, 50 / w + 50 / h, 0, 2 * Math.PI, false);
        ctx.fillStyle = c;
        ctx.fill();
    });
});
tracking.track('#video', colors, {
    camera: true
});

Instead of drawing a circle, you could draw anything you want. Here are two more examples:

Rectangle:

ctx.strokeStyle = c;
ctx.strokeRect(rect.x, rect.y, w, h);

Line (this requires saving the last coordinates):

ctx.beginPath();
ctx.lineCap = "round";
ctx.lineJoin = "round";
ctx.moveTo(oldx, oldy);
ctx.lineTo(x, y);
ctx.strokeStyle = c;
ctx.stroke();

oldx = x;
oldy = y;



Also, you may just want to try this via localhost if it's hosted on your own development machine (there's an npm module called ['http-server'](https://www.npmjs.com/package/http-server) which does this just fine!).

Here's a link to the [MDN site regarding the Canvas API](https://developer.mozilla.org/docs/Web/API/Canvas_API).
Hope I could help 😅 
lucasroth commented 7 years ago

Thanks!

Now I'm having trouble with the 3D canvas. In that context, when importing a 3D model (JSON from clara.io into three.js), I cannot render the 3D model in front of the video background in z-order. Have you tested running the CV/tracking part on a server, and the rendering on the client front end (browser, JS using three.js)?

Best regards, great work you have done! :)


AnthyG commented 7 years ago

@lucasroth A little bit hard to read, but if I understand correctly, you want to use three.js to render a 3D object you got from clara.io? I don't really get what you mean with "in front of the video background in z-order", sorry 😞

But tracking server-side and then rendering client-side is not a great approach unless you have a really stable connection to the server, as you have to send all the data the client gets from your camera TO the server, do whatever you want there, and then send the result(s) BACK to the client...

I haven't really looked into 3D rendering with the canvas element and JS yet, and I've had limited success with three.js.

So you may just want to edit that part about the z-order and the 'video background' (do you want to render the 3D object bigger than the video, or what? 🤔).

I'll try to help as much as I can, but I won't really experiment with 3D rendering myself just yet, though you may want to check out issue #193.

lucasroth commented 7 years ago

Hi, thanks for the reply! Sorry for my lack of proficiency in explaining my project... I want to build augmented reality for the web browser (Chrome or Firefox), where you can see a 3D model (for example a car) rendered in real time, tracked to a target: marker, markerless, face tracking, or whatever is feasible.

I use three.js (WebGL) to load and render the 3D object, and it works fine by itself, without live video capture and tracking. These are the steps I use to get the 3D model: export from Cinema 4D as an FBX with baked textures, then use clara.io to export it as a JSON object; in that format it is nice and easy to load and render with three.js.

When I use tracking.js I can easily get tracking info and render 2D objects (the color camera example), but I cannot render the 3D object in front of the video capture in real time (the idea is to get the 3D model rendered and tracked in real time). The first thing I tried was changing the z-order property in the .css file, but I realized that doesn't work for the 3D (canvas) object: with that property you can easily handle the 2D objects, but with 3D models nothing happens :(

Attached is a pic of what happens: video capture is OK, tracking is OK, but the 3D model is rendered "behind" it. While studying how to solve this, I found an example using OpenCV running on a server, so the video capture is not rendered as a canvas and you can therefore render 3D objects "in front" of it: https://www.smashingmagazine.com/2016/02/simple-augmented-reality-with-opencv-a-three-js/

What do you think we can do to get this working? I think the work you have done is great, and I would be glad to help in any way I can (as far as I can be helpful).

Regards,


AnthyG commented 7 years ago

But doesn't three.js also use a canvas to render the objects? Maybe just try putting the canvas element 'underneath' the video element, like this:

<div id="C">
    <video id="video" width="800" height="600" preload autoplay loop muted></video>
    <canvas id="canvas" width="800" height="600"></canvas>
</div>

I did actually try and set the z-index too, but it didn't work somehow 🤔
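
For what it's worth, one common pattern is to keep the video as a normal element and let three.js render into a transparent, absolutely positioned canvas stacked on top of it. This is just a sketch under assumptions (the element id and sizes are made up, and it isn't something tested in this thread):

#webgl-canvas {
    position: absolute;
    top: 0;
    left: 0;
    z-index: 2; /* above the video */
}

// three.js renderer drawing into the overlay canvas with a transparent background
var renderer = new THREE.WebGLRenderer({
    canvas: document.getElementById('webgl-canvas'),
    alpha: true
});
renderer.setClearColor(0x000000, 0); // fully transparent clear color
renderer.setSize(800, 600);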

devotionsolutions commented 7 years ago

Hi @lucasroth, we are both actually doing the same thing: a web-based augmented reality experience :D What I'm using is tracking.js + Babylon.js, and I'm adding the video from the webcam as a VideoTexture in Babylon's scene. This way, we are able to render the 3D object in front of the camera stream. HTH
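
Roughly, the idea looks like this. Just a sketch of the approach, not my actual code: the ids, sizes, and camera setup are assumptions, and passing the existing video element straight to VideoTexture assumes a Babylon.js version that supports it:

// Babylon.js scene that shows the webcam video on a background plane,
// so any 3D mesh added to the scene renders in front of the camera stream
var engine = new BABYLON.Engine(document.getElementById('renderCanvas'), true);
var scene = new BABYLON.Scene(engine);
var camera = new BABYLON.FreeCamera('camera', new BABYLON.Vector3(0, 0, -10), scene);

// Reuse the <video> element that tracking.js is already reading from
var videoTexture = new BABYLON.VideoTexture('webcam', document.getElementById('video'), scene, true);
var mat = new BABYLON.StandardMaterial('videoMat', scene);
mat.diffuseTexture = videoTexture;
mat.emissiveColor = new BABYLON.Color3(1, 1, 1); // keep the video unaffected by scene lighting

var backdrop = BABYLON.MeshBuilder.CreatePlane('backdrop', { width: 16, height: 12 }, scene);
backdrop.material = mat;
backdrop.position.z = 20; // push the video plane behind everything else

engine.runRenderLoop(function () {
    scene.render();
});

Any mesh placed closer to the camera than the backdrop plane then appears in front of the video.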

lucasroth commented 7 years ago

Yes, I have tried that too, but I can't call any library that does the tracking at the same time, so I can't render tracked to a target :( Let's stay in touch so we can sort this out as a team :D


devotionsolutions commented 7 years ago

Hi @lucasroth, I'm not sure I understand what your problem is now, but PM me and I'll try to help.

Thanks,