Actually, after messing with WebVR I thought that I should just remove CardboardEffect.js
as this is something the browser itself will handle eventually.
I've not experienced frame-dropping issues with this on the S3 with Cardboard turned on, but only in Firefox. I believe it's as good as it gets for a pseudo implementation. On the S3 nothing renders in Chrome stable anyway, and it drops frames in Chrome Dev. It has nothing to do with WebGL; it happens simply playing back video.
I believe it should stay as a fallback option. However, if you flag WebVR on in the browser, it doesn't add distortion for Cardboard; it renders plain stereo, acting like the StereoEffect. I believe the glasses are the ones that add distortion.
Here is the funky part. WebGL is broken because Gear VR launches its own app when the glasses are put on, not the WebVR API in the browser, and that app no doubt uses the stock Android browser, which does not work at all. Only Firefox works, and Chrome partially. Chrome is only working on the S7 as far as I'm aware, and for some reason on the Nexus phones, but not many people really own a Nexus. Samsung devices should be tested, but the Chrome people don't bother with that.
All this is still a work in progress...
I'm removing CardboardEffect
because it's better to focus the maintenance effort on what's going to be used the most.
Anyone is free to pick up CardboardEffect
from the history and create (and maintain) a repo for it.
OK. May I ask how the VREffect is supposed to deal with Cardboard distortion? The default WebVR device on Android is actually called "Cardboard", so it's using the orientation APIs etc. but rendering stereoscopic with no distortion.
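For context, a minimal plain-JS sketch (hypothetical helper and numbers, not three.js API) of what "stereoscopic with no distortion" means here: two eye views offset by half the interpupillary distance, rendered side by side, with no lens correction at all.

```javascript
// Sketch of plain stereo rendering as the default "Cardboard" WebVR
// device does it: two views offset by half the interpupillary distance
// (IPD), side by side, and no lens distortion applied at all.
// The 0.064 m IPD is just a common default, not a value from any spec.
function eyeOffsets(ipd) {
  return { left: -ipd / 2, right: ipd / 2 };
}

eyeOffsets(0.064); // { left: -0.032, right: 0.032 }
```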
@toji any plans on adding Cardboard distortion to Android WebVR soon?
The webvr polyfill now does distortion for you, we don't need to wait for the browser.
The distortion is also really performant now
Native distortion in Android may be a little ways out for silly technical reasons. I think an intermediate step may be to have a native implementation that provides the pose data and allows the polyfill to add distortion to it. Eventually we will get around to native distortion, though.
OK, I get it: you send the data to a polyfill for fallback distortion, i.e. getProjectionMatrixLeftEye, no worries. An actual WebVR fallback would use the orientation APIs; Firefox has a fix for that, but only in Nightly, so only Chrome is working properly there. But then there is an obvious problem with Chrome not working on older Samsung devices like the S3, whereas Firefox works!
Possibly not a priority; however, as mentioned, when using Gear VR I have confirmed it will launch an app replacing Chrome that doesn't even function with WebGL / WebVR / CORS, no doubt just like the stock browser.
So it's a bit useless and short-sighted on their behalf. No idea about Nexus. I will test properly when I get access to an S7 for debugging.
Hello all,
I would like to add a comment on the issue. The webVR polyfill works great and for the most part is super easy to use!
However, performance on smartphones is still a problem. With a simple pano walkthrough tool and a set of cube maps, performance drops from a constant 60fps to about 30fps on my OnePlus One, or the image quality gets really bad, and 30fps is a no-go for VR.
I implemented the vertex displacement solution mentioned here in a custom shader and I get a neat 60fps. This actually isn't a lot of code, and it would be really nice to have support in all materials, but I can imagine it might break a lot of things, especially when it comes to lighting.
I don't know, maybe it would be worth having a look into?
Best, Richard
Hi, wrong thread. I am assuming you are talking about putting a Cardboard distorter back into the project after it was removed. I ported it to another repo here:
https://github.com/danrossi/three-vr-cardboardeffect
The distorter code has been removed from the vrview project. Any idea where it could be found, so I could look at implementing it into this project? I've seen no update in terms of native Cardboard support in WebVR on Android yet, so the Cardboard mode is still required.
No, I wasn't talking about the Cardboard effect (it doesn't work, performance- / image-quality-wise). I am talking about adding something like a VR vertex shader chunk snippet to all materials...
It simply outperforms the CardboardEffect / webvr-polyfill, even though it needs a lot of additional vertices, at least when it comes to cube maps and simple geometry. Haven't tested it on complex scenes.
Please find attached my (humble) implementation... I merged the code from vrview into the meshbasic shader...
That is great. Yes, I am using ShaderMaterial now also for 360 video; it certainly removes bloat compared to using the built-in shader. See this killer thread:
https://github.com/mrdoob/three.js/issues/9754#issuecomment-273483885
So this is a barrel distorter? Could it be applied to a standard StereoCamera then?
I might play around with it, because the one integrated into the webvr-polyfill is not modular and can't be used externally.
Yes, I am using it together with the StereoEffect, and also on video shaders and Facebook's video cube mapper that you can get for ffmpeg... some problems with iPhone.
example: http://www.deepinterface.com/demo/360XT/CURRENT/Spielplatz/
You have to open it on a phone to get the stereo mode...
The question is whether it could be implemented into all three.js materials, and whether it could also support additional devices...
I think a place to start would be by creating a webvr_vertexbarrel.html
example with the custom shader. After that we could see how feasible it would be to move it to the materials. Would you like to do a PR?
I'm about to have a look to see how it looks on my end. Having it back as an example might be good.
I can do the example, but I don't have any experience developing on platforms such as GitHub...
I guess PR means pull request; a little help would be nice.
EDIT: Will use one of the buffer geometry examples...
I just implemented it on the StereoEffect example. That example is a bit complicated, and it's only affecting the bubbles. I can see distortion; it's just not affecting the image.
I will try it on the video example, which is what I need it for.
The code needed a bit of cleaning up to make it work, but it's doing something.
@schaf82 fork the project, check out the dev branch (which I think is the default), add the example, then commit back to your fork. Then do a pull request from your GitHub fork page. I can do it for you if you wish; I just need an example. I'm looking to see if it actually works first.
@danrossi
Guessing the background is a simple cube-> only corner vertices to distort -> no effect. This is the downside.
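That downside can be quantified with a rough sketch (hypothetical helper; in three.js you would pass segment counts to the geometry, e.g. `new THREE.BoxGeometry(10, 10, 10, 20, 20, 20)`):

```javascript
// Per-vertex distortion can only bend geometry where vertices exist, so
// a one-segment cube face (4 corner vertices) shows no curvature at all.
// Hypothetical helper: vertices per face for a given segment count.
function verticesPerFace(segments) {
  return (segments + 1) * (segments + 1);
}

verticesPerFace(1);  // 4   -> corners only, no visible distortion
verticesPerFace(20); // 441 -> enough to approximate the barrel curve
```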
This might also help: https://github.com/mrdoob/three.js/wiki/How-to-contribute-to-three.js
I just tried an implementation like this, because using the basic fragment shader wasn't displaying at all. No distortion yet. This is a slight modification to the shader I use. I was testing against the rectangular video example, with / without the StereoEffect.
var uniforms = THREE.UniformsUtils.merge( [
    THREE.UniformsLib.common,
    THREE.UniformsLib.aomap,
    THREE.UniformsLib.fog,
    {
        uDistortionCoefficients: {
            type: 'fv1',
            value: [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ]
        },
        uDistortionMaxFovSquared: {
            type: 'f',
            value: getDistortionMaxFovSquared( 60, 1 )
        },
        uDistortionFovScale: {
            type: 'v2',
            value: getDistortionFovScale( 60, 1 )
        },
        uDistortionFovOffset: {
            type: 'v2',
            value: getDistortionFovOffset( 60, 1 )
        }
    }
] );
var vertexShader = [
    getDistortionInclude(),
    "varying vec2 vUV;",
    "void main() {",
    "  vUV = vec2( uv.x, uv.y );", // flipY: 1.0 - uv.y
    "  vec4 pos_ = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    //"  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    "  gl_Position = Distort( pos_ );",
    "}"
].join( "\n" );
var fragmentShader = [
"uniform sampler2D texture;",
"varying vec2 vUV;",
"void main() {",
" gl_FragColor = texture2D( texture, vUV );",
"}"
].join( "\n" );
uniforms.texture = { value: texture };
var material = new THREE.ShaderMaterial({
uniforms: uniforms,
vertexShader: vertexShader,
fragmentShader: fragmentShader,
name: "cubeShader"
});
material.needsUpdate = true;
updateUniforms(material, "CARDBOARD II", 16/9);
Created an example. It runs at 45fps (40fps in fullscreen) without lighting, compared to 60fps without lighting and without the distortion.
http://www.planund.com/_samples/webglVR_buffergeometry/index.html
http://www.planund.com/_samples/webglVR_buffergeometry/index2.html
I guess some work needs to be done but I think it is a promising start.
It doesn't work with video textures; just grey output. But then again, setting up a ShaderMaterial with the basic shader does the same.
I tried applying a texture like so, and got no output.
material.uniforms.map = {type: 't', value: texture };
Yes, I know this is a problem. I don't know why, but you have to set the map both in the uniforms and at the level of the material, which is why there was this bit of code:
mat.map = ""; // ensure the map parameter exists at material level, not sure if needed...
mat.setValues( params ); // add the params to the material level
mat.needsUpdate = true; // not sure, but as this only runs once, update it...
mat.uniforms.map = { type: 't', value: params.map }; // add the map at uniform level
updateUniforms( mat, type, aspect ); // update distortion type...
return mat;
It seems a bit complicated, but for some reason this is working.
I will create a simple pano example in the next few days, also including video... and will try to make a pull request.
This basic vertex shader should have worked. If I use my custom ones rather than trying to include the basic shaders, it works. But I see no visible distortion.
var vertexShader = [
    getDistortionInclude(),
    "varying vec2 vUV;",
    "void main() {",
    "  vUV = vec2( uv.x, uv.y );", // flipY: 1.0 - uv.y
    "  vec4 pos_ = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    //"  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    "  gl_Position = Distort( pos_ );",
    "}"
].join( "\n" );
I now get something working with this:
var material = new THREE.cubeShader( "CardboardII", camera.aspect / 2, {
    side: THREE.DoubleSide, vertexColors: THREE.VertexColors, map: texture
} );
material.map = "";
material.setValues( params );
material.uniforms.map = { type: 't', value: params.map };
It's like my custom shader. Should I be seeing barrel distortion on the sides?
Example? It should look like this:
Undistorted: [screenshot]
Cardboard II distortion: [screenshot]
These are textures mapped to a cube with 20*20 vertices / side...
From my understanding, the Three.js Google Cardboard distortion effect (CardboardEffect.js) works by rendering to a render-target texture, then using that texture applied to a simple distorted plane geometry.
I have noticed a significant drop in frame-rate when I use the effect, even on relatively high-performance phones like the LG G3. I presume this is because the pixels are effectively being drawn twice - once to the render target, and then again to the screen buffer via the plane mesh.
I guess this is aggravated by the high-density screens common in high-end phones.
Unfortunately, dropping the FPS like that kills the VR experience, leaving the current Cardboard effect only suitable for very simple scenes.
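To put the double-draw cost of the render-to-texture approach in rough numbers, here is a sketch, not a benchmark; it assumes a QHD 2560x1440 panel, and the cube-cost helper is hypothetical and ignores shared edges between faces:

```javascript
// The render-target pass touches every screen pixel a second time,
// while per-vertex distortion pays per vertex instead. Rough sketch:
function extraPixelCopies(width, height) {
  return width * height; // one extra copy per screen pixel, per frame
}

function cubeVertexCost(segmentsPerSide) {
  // 6 faces, (n + 1)^2 vertices each; shared edges ignored for brevity
  return 6 * (segmentsPerSide + 1) * (segmentsPerSide + 1);
}

extraPixelCopies(2560, 1440); // 3686400
cubeVertexCost(20);           // 2646 -- orders of magnitude fewer
```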
I've looked into this, and found a recent reference to a technique that involves using vertex shaders to distort the scene geometry at render time, thus eliminating the render-to-texture part (that saves 3,686,400 pixel copies on my phone, and I don't have anywhere near that number of vertices).
https://ustwo.com/blog/vr-distortion-correction-using-vertex-displacement
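The core of that technique is a radial polynomial applied to each vertex in the vertex shader. A plain-JS sketch of the math (the coefficient value below is made up, not taken from any real Cardboard lens profile):

```javascript
// Radial (barrel) distortion: scale a vertex's distance r from the lens
// centre by a polynomial in r^2, mirroring what a vertex shader would do
// with a uDistortionCoefficients-style uniform. Coefficients are made up.
function distortRadius(r, coefficients) {
  const r2 = r * r;
  let factor = 1.0;
  let rPow = r2;
  for (const k of coefficients) {
    factor += k * rPow;
    rPow *= r2;
  }
  return r * factor;
}

distortRadius(0.5, [0.34]); // 0.5 * (1 + 0.34 * 0.25) ≈ 0.5425
```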
Is this something that could be implemented in Three.js to help VR performance? Is it possible to 'plug in' a vertex shader that does the distortion in the current pipeline of shaders?
Unfortunately, this kind of thing is outside of my knowledge currently.