DJviolin closed this issue 6 years ago.
i think you are hitting a bug i got in the projection matrix in jsartoolkit :(
aka the near/far planes of jsartoolkit are too far apart and thus create overwhelming z-fighting
I tried setting logarithmicDepthBuffer: true, but in this case the model doesn't appear if I enable AR.js, only its shadow does. In this three.js example, this setting fixes a similar dimming.
This model is a real-sized building (200-300 m long) converted to glTF 2.0 and rendered at 0.04 of the original size in the code. Are you saying I should resize my model to the final projection size (~0.5 m) in SketchUp/Blender etc.?
good thinking! logarithmicDepthBuffer should help... i don't understand why it doesn't.
the dimming you see is called z-fighting, and logarithmicDepthBuffer has been specifically designed for that...
You use the three.js binding, right? logarithmicDepthBuffer is a property of WebGLRenderer?
I tried calling like this:
var renderer = new THREE.WebGLRenderer({
  antialias: true,
  alpha: true,
  logarithmicDepthBuffer: true, // logarithmic z-buffer
});
And this is the result if I enable jsartoolkit and AR.js:
Without jsartoolkit and AR.js, it's the same as my first example: no dimming, no triangle edges.
@jbaicoianu are we missing something? What do you think?
I can provide this test project via email if you're interested.
EDIT: I sent to your email.
@jeromeetienne logarithmic depth buffer might not be the best solution here - it's more for handling cases where you have huge objects off in the distance, but still want to render close-up objects accurately as well. It does this by trading off some precision for close-up stuff, in favor of increased range. It also has some weird interaction with shadow maps, which I haven't been able to wrap my head around - it's been an outstanding issue for years now, I need help from someone who knows more about shadows than I do to fix it :(
All that being said, I think the solution here is that AR.js should adapt its camera parameters to the "size" of the scene. You can get a sense of the general size of the scene by looking at the bounding spheres and positions of every object within it. The camera's near plane is set either to the distance to the closest object minus its boundingSphere.radius, or to some minimum value like .01 (in case the camera is inside the bounding sphere of an object), and the far plane is set to the distance to the furthest object plus its boundingSphere.radius. Of course you must account for position + radius while sorting, to handle objects which are closer but have a larger radius, like terrain.
This calculation can be done once at init for relatively static scenes, or if the scene is dynamic then it may be necessary to do this every frame, or every several frames. For static scenes, just make sure you account for everywhere the camera might be able to move, so you don't start clipping as soon as the camera moves back.
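That fitting logic can be sketched independently of three.js. In this minimal sketch (the function name and the 0.01 floor are just illustrative), each object is reduced to its distance from the camera and its bounding-sphere radius:

```javascript
// Fit camera near/far planes to a list of objects, each described by its
// distance from the camera and its boundingSphere.radius.
function fitNearFar(objects, minNear) {
  minNear = minNear || 0.01; // floor, in case the camera is inside a sphere
  var near = Infinity;
  var far = -Infinity;
  for (var i = 0; i < objects.length; i++) {
    var o = objects[i];
    near = Math.min(near, o.distance - o.radius); // nearest point of this sphere
    far = Math.max(far, o.distance + o.radius);   // furthest point of this sphere
  }
  return { near: Math.max(near, minNear), far: far };
}
```

For a dynamic scene this would run every frame or every few frames; for a static scene, once at init, padded for wherever the camera can move.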
Actually just realized that this is really more of a problem with your shadowmap parameters than with your camera's projection matrix. It's the same basic problem, but the shadowmap camera is even more sensitive to the range between near and far planes.
You can probably fix this by playing with the shadow parameters on your light, specifically:
light.shadow.camera.near
light.shadow.camera.far
light.shadow.bias
light.shadow.mapSize
My current light settings are:
var light = new THREE.SpotLight(0xffeedd);
light.position.set(3.5, 10, 1);
light.castShadow = true; // default false
light.shadow.mapSize.width = 2048; // default: 512
light.shadow.mapSize.height = 2048; // default: 512
light.shadow.camera.near = 0.5; // default: 0.5
light.shadow.camera.far = 500; // default: 500
light.shadow.camera.fov = 90;
//light.intensity = 1;
scene.add(light);
This is the same as in the three.js-only setup, and in that case there's no dimming.
@DJviolin try setting light.shadow.bias to something like .01 or .02. Shadows are very tricky to do in a general way; you often have to tweak them for the specific scene to eliminate things like this. What you're seeing is polygon self-shading, eg, at the edges of the polygon it thinks that the polygon is casting a shadow onto itself. You get the weird stipple/z-fighting look because of floating point inaccuracies.
The bias parameter lets you manually pull the shadows out away from the geometry to prevent this from happening. This can lead to some "gapping" if the value is too high, where the shadow doesn't connect to the object that's casting it, but with very small values it helps to overcome the floating point inaccuracies which cause self-shading.
As for why it works with the same params in plain three.js vs ar.js..well, it's a good question, the main difference is that you're using the camera that ar.js sets up with the default params rather than your own, which again brings us back to @jeromeetienne's suggestion that it might be his default near/far plane for the PerspectiveCamera. So maybe there is some interaction between the shadowmap parameters and whatever camera is being used to render the scene which I don't understand - like I said, I'm a little unclear about the underlying implementation details, I'm certain there's something like this that I'm missing because it seems like the same problem I have with logarithmicDepthBuffer
just note: there is an actual problem with the rendering camera. i experience z-fighting on a very simple scene. i dunno how it affects this particular problem tho :)
Now i leave you to discuss this. @jbaicoianu thanks a bunch for your help :)
I tried setting light.shadow.bias to 0.1 and 0.2, with no effect. I even set extremely small and large values like 0.000001 and 100000, also with no effect.
Thinking more about this, I commented out the spotlight, enabling only the ambient light to affect the scene:
var ambient = new THREE.AmbientLight(0x222222);
ambient.intensity = 5;
scene.add(ambient);
Which is still causing the issue:
Setting ambient.bias = 0.1; or any other small or large value is also ineffective.
Oh, sorry, I think I misunderstood which part you were looking at and saying looked bad. In your second shot, the part that jumped out at me first was the saw-tooth shadows that showed up on the ramp leading up to the building. But now compared to the third shot yes, I see that all around the roof line is much cleaner.
This might just be a matter of the default ar.js parameters causing z-fighting after all. It should be easy enough to experiment with changing these if AR.js exposes its camera object; just remember to call camera.updateProjectionMatrix() after changing them.
I don't know exactly where to put this camera.updateProjectionMatrix() call. @jeromeetienne any thoughts about this? Is there some kind of debugging for the camera object? I'm logging these now:
console.log(camera);
console.log(arToolkitContext);
But I'm not exactly sure what to change from the results.
in AR.js the projection matrix is quite specific. it is derived from calibration, not from the usual camera.updateProjectionMatrix(); if you call this function you will destroy the actual projection matrix.
this projection matrix has to match the characteristics of the actual physical webcam you are using.
the camera to modify is the THREE.Camera that is used for rendering.
I logged the camera object with AR.js on and off. I also changed the AR.js code to PerspectiveCamera:
THREE.PerspectiveCamera: {
  "metadata": { "version": 4.5, "type": "Object", "generator": "Object3D.toJSON" },
  "object": {
    "uuid": "996DFAEB-C116-4019-917A-3ED5E6450550",
    "type": "PerspectiveCamera",
    "matrix": [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
    "fov": 75,
    "zoom": 1,
    "near": 0.1,
    "far": 1000,
    "focus": 10,
    "aspect": 1.5031315240083507,
    "filmGauge": 35,
    "filmOffset": 0
  }
}
THREE.PerspectiveCamera: {
  "metadata": { "version": 4.5, "type": "Object", "generator": "Object3D.toJSON" },
  "object": {
    "uuid": "DAAEF412-5466-498C-B86D-75A7F98439B2",
    "type": "PerspectiveCamera",
    "visible": false,
    "matrix": [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
    "fov": 75,
    "zoom": 1,
    "near": 0.1,
    "far": 1000,
    "focus": 10,
    "aspect": 1.5031315240083507,
    "filmGauge": 35,
    "filmOffset": 0
  }
}
The only difference is "visible": false.
If I examine the raw objects in the console without JSON.stringify, these are the differences (plus the uuid, of course):
matrixAutoUpdate: true // three.js
matrixAutoUpdate: false // three.js + jsartoolkit5 + AR.js
parent: null // three.js
parent: od // three.js + jsartoolkit5 + AR.js
The parent: od element with AR.js looks like this with JSON.stringify:
THREE.PerspectiveCamera.parent: {
  "metadata": { "version": 4.5, "type": "Object", "generator": "Object3D.toJSON" },
  "object": {
    "uuid": "6826498C-AAA5-47F8-AA33-2E8A6B05343B",
    "type": "Scene",
    "visible": false,
    "matrix": [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
    "children": [
      {
        "uuid": "B20ACB54-5BBF-40B5-9E87-9B4A1DB45CDE",
        "type": "AmbientLight",
        "matrix": [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
        "color": 2236962,
        "intensity": 5
      },
      {
        "uuid": "83A8FEFD-15C1-4157-8E08-51B0645B50E9",
        "type": "PerspectiveCamera",
        "visible": false,
        "matrix": [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
        "fov": 75,
        "zoom": 1,
        "near": 0.1,
        "far": 1000,
        "focus": 10,
        "aspect": 1.5031315240083507,
        "filmGauge": 35,
        "filmOffset": 0
      }
    ]
  }
}
I think these logs are not helping, but maybe useful.
I tried turning off the AmbientLight too, so there is no light source in the scene at all, but the dimming still remains.
Just confirming that resizing the model to the final projection size in SketchUp doesn't change anything.
Because the original glTF examples don't have this error, could it be an error in the model conversion workflow? I'm using SketchUp + collada2gltf 2.0. I tried the Blender route, but then I don't have textures. I'm trying other formats like OBJ.
.obj files are also affected by dimming.
var mtlLoader = new THREE.MTLLoader();
mtlLoader.setPath('./models/obj/');
mtlLoader.setCrossOrigin('anonymous');
mtlLoader.load('model.mtl', function (materials) {
  materials.preload();
  var objLoader = new THREE.OBJLoader();
  objLoader.setMaterials(materials);
  objLoader.setPath('./models/obj/');
  objLoader.load('model.obj', function (object) {
    object.position.y = 0.7;
    scene.add(object);
    onRenderFcts.push(function () {
      object.rotation.y += 0.01;
    });
  });
});
Since the textures are not power-of-two and three.js does the conversion, can that cause this z-fighting issue?
if my understanding is correct, the z-fighting issue is a matter of how large the 3d scene is compared to the distance between near/far in the rendering camera.
in fact, thinking out loud: maybe making the object bigger in the scene will spread the Z values more evenly between the camera's near/far and thus work around this issue...
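A quick way to test that thought (purely a hypothetical experiment; markerRoot is the group holding the model in the examples above):

```javascript
// Hypothetical experiment: scale the model up so its depth values
// span more of the camera's near/far range.
markerRoot.scale.set(10, 10, 10); // 10x is an arbitrary test factor
```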
This is a pure three.js + jsartoolkit5 implementation and the issue still remains, so the error is NOT caused by AR.js. Maybe it's time to open an issue on jsartoolkit5 (if there isn't already one about z-fighting)?
<html>
  <head>
    <title>Pattern marker example with Three.js</title>
    <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1">
    <style>
      html,body { margin: 0; padding: 0; width: 100%; text-align: center; overflow-x: hidden; }
      .portrait canvas { transform-origin: 0 0; transform: rotate(-90deg) translateX(-100%); }
      .desktop canvas { transform: scale(-1, 1); }
    </style>
  </head>
  <body>
    <script defer src='//rawcdn.githack.com/artoolkit/jsartoolkit5/77733182a4c519b8e683cbf246a22920d94f3deb/build/artoolkit.min.js'></script>
    <script defer src='//rawcdn.githack.com/mrdoob/three.js/r84/build/three.min.js'></script>
    <script defer src='//rawcdn.githack.com/artoolkit/jsartoolkit5/77733182a4c519b8e683cbf246a22920d94f3deb/js/artoolkit.three.js'></script>
    <script defer src='//rawcdn.githack.com/mrdoob/three.js/288709543605a598a99e45a9c9bc1c388e0df76e/examples/js/loaders/GLTF2Loader.js'></script>
    <script defer>
      window.ARThreeOnLoad = function() {
        ARController.getUserMediaThreeScene({
          maxARVideoSize: 320,
          cameraParam: 'data-jsartoolkit5/camera_para.dat',
          onSuccess: function(arScene, arController, arCamera) {
            var scene = new THREE.Scene();
            var renderer = new THREE.WebGLRenderer({
              antialias: true,
            });
            renderer.setPixelRatio(window.devicePixelRatio);
            renderer.setSize(window.innerWidth, window.innerHeight);
            document.body.appendChild(renderer.domElement);
            var ambient = new THREE.AmbientLight(0x222222);
            ambient.intensity = 5;
            arScene.scene.add(ambient);
            // Instantiate a loader
            var loader = new THREE.GLTF2Loader();
            // Load a glTF resource
            var url = './models/model.gltf';
            loader.load(url, function (gltf) {
              var sphere = gltf.scene || gltf.scenes[0];
              sphere.rotation.x = 90 * (Math.PI / 180);
              sphere.rotation.y = 270 * (Math.PI / 180);
              sphere.rotation.z = 0 * (Math.PI / 180);
              arController.loadMarker('data-jsartoolkit5/patt.hiro', function(markerId) {
                var markerRoot = arController.createThreeMarker(markerId);
                markerRoot.add(sphere);
                arScene.scene.add(markerRoot);
              });
            });
            var tick = function() {
              arScene.process();
              arScene.renderOn(renderer);
              requestAnimationFrame(tick);
            };
            tick();
          }
        });
        delete window.ARThreeOnLoad;
      };
      if (window.ARController && ARController.getUserMediaThreeScene) {
        ARThreeOnLoad();
      }
    </script>
  </body>
</html>
Update:
I had to change the camera to PerspectiveCamera to be able to get it working with logarithmicDepthBuffer: true. Tested on an Android phone as well. Now the model shows up as it should. With Three.js release 88, some shading/lighting effects also kick in, and the glTF model exported from SketchUp now has the same look as in Mixed Reality Viewer on Windows.
const camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 1000);
However I still see some dimming on the edge lines, but only in motion, and luckily not on the textures anymore. Perhaps this is still connected to the z-fighting issue?
Strangely, the anti-aliasing is turned off / reduced on some edges, but not on the whole model. This is the previous example on r87, without logarithmicDepthBuffer: true and PerspectiveCamera, with proper anti-aliasing. The previous example had worse AA than that:
It seems that by turning off antialias and upscaling the model I can control the AA without blurring the model, and I can find the best trade-off between quality and performance for the minimum hardware I want to run on:
const upscale = 2; // render the model at 200%
renderer.setSize(window.innerWidth * upscale, window.innerHeight * upscale);
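Note that three.js's renderer.setSize also resizes the canvas's CSS size by default, so supersampling like this makes the canvas overflow the window. A sketch of one way around that (assuming the same renderer and upscale variables): pass false as the third updateStyle argument and pin the CSS size yourself.

```javascript
const upscale = 2;
// Render into a 2x drawing buffer but display the canvas at window size.
renderer.setSize(window.innerWidth * upscale, window.innerHeight * upscale, false);
renderer.domElement.style.width = window.innerWidth + 'px';
renderer.domElement.style.height = window.innerHeight + 'px';
```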
That strange edge dimming still remains in motion.
I have the same problem! I assume it's because of the z-fighting issue. To mitigate it, it's suggested to use logarithmicDepthBuffer. See this example: https://threejs.org/examples/webgl_camera_logarithmicdepthbuffer.html
I couldn't try it yet, since I don't know how to access and change the properties of THREE.WebGLRenderer. Does anybody know how to set logarithmicDepthBuffer to true inside of A-Frame with AR.js?
@ab0zkurt It's kind of hacky, but I had to modify the aframe component to enable logarithmicdepthbuffer. I used the code in https://github.com/aframevr/aframe/issues/926 to create a new renderer component and then just added logarithmicDepthBuffer: true.
AFRAME.registerComponent('renderer', {
  init: function () {
    var antialias = this.el.getAttribute('antialias') === 'true';
    this.el.renderer = new THREE.WebGLRenderer({
      canvas: this.el.canvas,
      antialias: antialias || window.hasNativeWebVRImplementation,
      logarithmicDepthBuffer: true,
      alpha: true,
      preserveDrawingBuffer: true
    });
    this.el.renderer.setPixelRatio(window.devicePixelRatio);
    this.el.renderer.sortObjects = false;
    this.el.effect = new THREE.VREffect(this.el.renderer);
  },
  remove: function () {
    var antialias = this.el.getAttribute('antialias') === 'true';
    this.el.renderer = new THREE.WebGLRenderer({
      canvas: this.el.canvas,
      antialias: antialias || window.hasNativeWebVRImplementation,
      logarithmicDepthBuffer: true,
      alpha: true,
      preserveDrawingBuffer: false
    });
  }
});
And then attach it to the scene:
<a-scene embedded arjs renderer='antialias: true;'>
@Tino-F I've solved this problem already by doing exactly the same thing you suggested. Thanks anyway for your help!
Using AR.js directly (not A-Frame) for pattern marker recognition and "binding" a glTF 3D model to a marker (starting from this example), setting logarithmicDepthBuffer: true when creating the new THREE.WebGLRenderer({...}) solved the problem for me.
Get the near/far planes from the projection matrix: see https://forums.structure.io/t/near-far-value-from-projection-matrix/3757
// only to get the near/far planes; to set them, use the code below
arToolkitContext.init(function onCompleted() {
  let m = arToolkitContext.getProjectionMatrix();
  let m22 = m.elements[10];
  let m32 = m.elements[14];
  let near = (2.0 * m32) / (2.0 * m22 - 2.0);
  let far = ((m22 - 1.0) * near) / (m22 + 1.0);
  console.log(near); // 0.0001
  console.log(far); // 1000.0000006062936
})
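As a sanity check (pure JS, no AR.js needed, the function names are just illustrative): build the two depth-related elements of a standard symmetric OpenGL perspective matrix from known near/far, then recover near/far with exactly the formulas above.

```javascript
// m22 and m32 are elements [10] and [14] of a column-major OpenGL
// perspective projection matrix.
function depthElements(near, far) {
  return {
    m22: -(far + near) / (far - near),
    m32: -(2 * far * near) / (far - near)
  };
}

// Invert them: near = m32 / (m22 - 1), far = m32 / (m22 + 1).
function recoverNearFar(m22, m32) {
  var near = (2.0 * m32) / (2.0 * m22 - 2.0);
  var far = ((m22 - 1.0) * near) / (m22 + 1.0);
  return { near: near, far: far };
}
```

Round-tripping near = 0.1, far = 1000 through these two functions returns the original values up to floating-point error.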
so the default near plane is 0.0001, way too small.
To solve this problem you can set new near/far planes:
// paste this inside your arToolkitContext.init callback
arToolkitContext.init(function onCompleted() {
  let m = arToolkitContext.getProjectionMatrix();
  let far = 1000;
  let near = 0.1;
  m.elements[10] = -(far + near) / (far - near);
  m.elements[14] = -(2 * far * near) / (far - near);
  camera.projectionMatrix.copy(m);
})
That solved the z-fighting bug for me
Simply adding renderer="logarithmicDepthBuffer:true;" to <a-scene> did the trick for me.
<a-scene renderer="logarithmicDepthBuffer:true;">
I didn't have the need to register my own "renderer" component.
Sorry to necro, @calvinclaus. I am having this issue with A-Frame 0.9.2 and AR.js 1.7.7. I see you solved your issue using logarithmicDepthBuffer:true, but if I use that, any text that was clipping/having z-fighting issues no longer renders when in front of another object.
Please see my glitch here for example: https://glitch.com/edit/#!/bbc-legacy-stretch
logarithmicDepthBuffer, you are The Master!
I have the following scenario: when I only use three.js, in this case this code is uncommented:
Everything is fine, lines are smooth, and there are no jagged triangle edges.
I activate AR.js by commenting out the upper portion of the code and putting this in its place:
Now, this is what I get:
Those vibrating triangle texture rendering errors make the model unenjoyable. Without AR.js I have a stable 60 FPS, with AR.js ~40 FPS.
My HTML file looks like this: