mrdoob / three.js

JavaScript 3D Library.
https://threejs.org/

LatitudeReflectionMapping #1621

Closed · mrdoob closed this 12 years ago

mrdoob commented 12 years ago

So it seems like @spite and I figured out the shader code for this type of reflection :)

http://www.clicktorelease.com/code/latitudeReflectionMapping/
http://dl.dropbox.com/u/7508542/three.js/latitude/index.html
http://dl.dropbox.com/u/7508542/three.js/latitude/refraction.html

vertexShader: [

    "varying vec3 vReflect;",

    "void main() {",
        "vec4 mPosition = objectMatrix * vec4( position, 1.0 );",
        "vec3 nWorld = normalize( mat3( objectMatrix[0].xyz, objectMatrix[1].xyz, objectMatrix[2].xyz ) * normal );",
        "vReflect = normalize( reflect( normalize( mPosition.xyz - cameraPosition ), nWorld ) );",
        "gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    "}"

].join( '\n' ),

fragmentShader: [

    "uniform sampler2D tDiffuse;",
    "varying vec3 vReflect;",

    "void main(void) {",
        "float PI = 3.14159265358979323846264;",
        "float yaw = .5 - atan( vReflect.z, - vReflect.x ) / ( 2.0 * PI );",
        "float pitch = .5 - atan( vReflect.y, length( vReflect.xz ) ) / ( PI );",
        "vec3 color = texture2D( tDiffuse, vec2( yaw, pitch ) ).rgb;",
        "gl_FragColor = vec4( color, 1.0 );",
    "}"

].join( '\n' )
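
For context, here's a minimal sketch of how the two snippets above could be wired into a material. Everything here is illustrative rather than the exact code from the demos: the snippets are assumed to live in an object called latitudeShader, and the texture-uniform layout changed between three.js revisions around that time.

var panorama = THREE.ImageUtils.loadTexture( 'panorama.jpg' ); // equirectangular image

var material = new THREE.ShaderMaterial( {

    // uniform layout is an assumption; older revisions kept the texture in a separate .texture field
    uniforms: { tDiffuse: { type: 't', value: panorama } },
    vertexShader: latitudeShader.vertexShader,
    fragmentShader: latitudeShader.fragmentShader

} );

// objectMatrix, cameraPosition, projectionMatrix and modelViewMatrix are supplied
// by the renderer, so tDiffuse is the only uniform that has to be declared here.
scene.add( new THREE.Mesh( new THREE.SphereGeometry( 5, 32, 16 ), material ) );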

@alteredq, should I try to implement it?

alteredq commented 12 years ago

Looks cool ;).

What's the pipeline for producing the images needed for that?

I think if it's something reasonable, then it would be nice to have this as part of the standard material system.

If it's something crazy that we would only have working with this single image, then it's better to leave it as a custom ShaderMaterial example.

mrdoob commented 12 years ago

Well, Google's Street View uses this type of texture ;)

@paullewis wrote an article that uses this projection as the basis for creating cube maps: http://aerotwist.com/tutorials/create-your-own-environment-maps/

Having this would save people from having to do the second part of the article :P

alteredq commented 12 years ago

Oh, this is cool. Google Street View is enough to make it worthwhile ;).

paullewis commented 12 years ago

Concur! Anything that lets people do their own ghetto environment maps is a winner! :D

mrdoob commented 12 years ago

Using asin instead of atan for pitch (as per link).

vertexShader: [

    "varying vec3 vReflect;",

    "void main() {",
        "vec4 mPosition = objectMatrix * vec4( position, 1.0 );",
        "vec3 nWorld = normalize( mat3( objectMatrix[0].xyz, objectMatrix[1].xyz, objectMatrix[2].xyz ) * normal );",
        "vReflect = normalize( reflect( normalize( mPosition.xyz - cameraPosition ), nWorld ) );",
        "gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    "}"

].join( '\n' ),

fragmentShader: [

    "uniform sampler2D tDiffuse;",
    "varying vec3 vReflect;",

    "void main(void) {",
        "float PI = 3.14159265358979323846264;",
        "float yaw = .5 - atan( vReflect.z, - vReflect.x ) / ( 2.0 * PI );",
        "float pitch = .5 - asin( vReflect.y ) / PI;",
        "vec3 color = texture2D( tDiffuse, vec2( yaw, pitch ) ).rgb;",
        "gl_FragColor = vec4( color, 1.0 );",
    "}"

].join( '\n' )
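
The two forms agree for a normalized vector, since length( v.xz ) equals sqrt( 1.0 - v.y * v.y ) there. A quick console sanity check with a made-up test vector:

var v = new THREE.Vector3( 0.3, 0.8, -0.2 ).normalize();

// both lines print the same pitch angle
console.log( Math.atan2( v.y, Math.sqrt( v.x * v.x + v.z * v.z ) ) );
console.log( Math.asin( v.y ) );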

spite commented 12 years ago

The demo with reflection, and refraction (with a bit of reflection mixed in), is here: http://www.clicktorelease.com/code/streetViewReflectionMapping/

zz85 commented 12 years ago

Very cool! :) Is this the same projection used in http://notlion.github.com/streetview-stereographic/ ?

Is there also alternative terminology for this Latitude Reflection Mapping? I tried googling that term and @spite's experiment is already on the front page :)

[edit] ok, I realized I missed reading the article linked by @mrdoob: http://www.reindelsoftware.com/Documents/Mapping/Mapping.html

mrdoob commented 12 years ago

Ah! I forgot about @notlion's one. I adapted his code a bit; nicer-looking code:

vertexShader: [

    "varying vec3 vReflect;",

    "void main() {",
        "vec4 mPosition = objectMatrix * vec4( position, 1.0 );",
        "vec3 nWorld = normalize( mat3( objectMatrix[0].xyz, objectMatrix[1].xyz, objectMatrix[2].xyz ) * normal );",
        "vReflect = reflect( normalize( mPosition.xyz - cameraPosition ), nWorld );",
        "gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
    "}"

].join( '\n' ),

fragmentShader: [

    "uniform sampler2D tDiffuse;",
    "varying vec3 vReflect;",

    "#define PI 3.141592653589793",

    "void main(void) {",
        "float lon = atan( vReflect.z, - vReflect.x );",
        "float lat = asin( vReflect.y );",
        "vec3 color = texture2D( tDiffuse, 0.5 - vec2( lon, lat ) / vec2( PI * 2.0, PI ) ).rgb;",
        "gl_FragColor = vec4( color, 1.0 );",
    "}"

].join( '\n' )
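
For reference, the same lookup on the CPU side (a sketch of a hypothetical helper, handy for sanity-checking UVs; dir is assumed to be a normalized THREE.Vector3):

function directionToEquirectUv( dir ) {

    var lon = Math.atan2( dir.z, - dir.x );
    var lat = Math.asin( dir.y );

    // mirrors 0.5 - vec2( lon, lat ) / vec2( PI * 2.0, PI ) in the fragment shader
    return new THREE.Vector2( 0.5 - lon / ( Math.PI * 2.0 ), 0.5 - lat / Math.PI );

}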

The POT (power-of-two texture) issue is still there though...

mrdoob commented 12 years ago

Well, after struggling with this for a couple of nights, it turns out this is not possible because of the way graphics cards work: when mipmapping is enabled we get seams at the edges of the texture.

The way to go is creating a WebGLRenderTargetCube out of the equirectangular texture:

http://dl.dropbox.com/u/7508542/three.js/latitude/cubemap.html
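
In broad strokes the conversion looks something like this (a sketch with made-up paths and sizes; SphereGeometry's UVs are already equirectangular, so rendering a textured sphere from its centre with a CubeCamera does the job):

var panorama = THREE.ImageUtils.loadTexture( 'panorama.jpg' );

// big sphere around the origin, textured with the panorama and seen from the inside
// (material.side = THREE.BackSide here; older revisions used a flipSided flag instead)
var sphere = new THREE.Mesh(
    new THREE.SphereGeometry( 100, 64, 32 ),
    new THREE.MeshBasicMaterial( { map: panorama, side: THREE.BackSide } )
);
scene.add( sphere );

var cubeCamera = new THREE.CubeCamera( 1, 1000, 256 ); // near, far, cube face size
scene.add( cubeCamera );
cubeCamera.updateCubeMap( renderer, scene );

// the resulting cube render target can then be used as an environment map
var shiny = new THREE.MeshBasicMaterial( { envMap: cubeCamera.renderTarget } );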

However, there seems to be an issue on ANGLE with WebGLRenderTargetCube + LinearMipMapLinearFilter, hopefully this gets fixed eventually.

http://dl.dropbox.com/u/7508542/three.js/latitude/cubemap_mipmap.html

So I'll just remove LatitudeReflectionMapping and LatitudeRefractionMapping from the lib.

alteredq commented 12 years ago

However, there seems to be an issue on ANGLE with WebGLRenderTargetCube + LinearMipMapLinearFilter, hopefully this gets fixed eventually.

I added a fix for this here:

https://github.com/alteredq/three.js/commit/ada3a940b82d3f6215cc53ae443e866af76fe71a

Unfortunately this still needs one hack: you need to call the cube camera update twice (otherwise you'll just get a completely black cubemap):

cubeCamera.updateCubeMap( renderer, scene );
cubeCamera.updateCubeMap( renderer, scene );

I tried to debug this with no success :S.

Notably, this trouble happens only with ANGLE; OpenGL is fine with just a single update.

mrdoob commented 12 years ago

Well, better than nothing :) Have you tried putting some flat-colored objects in the scene to see if they get rendered in the first update?

alteredq commented 12 years ago

As far as I understood, the problem is not the rendering of the cubemap per se; the problem is cubemap mipmap generation (that is, there were no problems when using just linear / nearest filters).

For some reason ANGLE / DirectX seem to be particular about when you can generate mipmaps.

Generating the mipmap after rendering into each cube face resulted in one of the faces being black (and bad performance).

Generating the mipmap just after the last cube face is rendered works, but you need to do the whole dance twice (render cube, generate mipmap, render cube, generate mipmap) to avoid getting a completely black cubemap.
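
In pseudocode, the difference between the two orderings (a simplified sketch, not the actual renderer internals; renderToCubeFace and generateCubeMipmaps are hypothetical helpers):

// ordering A: mipmaps after every face -> one black face on ANGLE, and slow
for ( var face = 0; face < 6; face ++ ) {
    renderToCubeFace( cubeTarget, face );
    generateCubeMipmaps( cubeTarget );
}

// ordering B: mipmaps once, after the last face -> works on ANGLE,
// but (before the fix below) the whole thing had to run twice
for ( var face = 0; face < 6; face ++ ) {
    renderToCubeFace( cubeTarget, face );
}
generateCubeMipmaps( cubeTarget );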

mrdoob commented 12 years ago

I see...

alteredq commented 12 years ago

And I got it working with a single updateCubeMap ;)

https://github.com/alteredq/three.js/commit/367ee78804beca03b523901f2e95dc65a4b512d4

Good that we started to poke into this: as a side effect we got a performance improvement for cube targets, and I think the same thing was going on with single-shot 2D render targets (where I had to do a similar 2x update hack before).

mrdoob commented 12 years ago

Yay! \:D/

mrdoob commented 12 years ago

Uhm... I just found some API weirdness...

I've been using this so far:

var material = new THREE.MeshBasicMaterial( { envMap: cubeCamera.renderTarget } );

http://dl.dropbox.com/u/7508542/three.js/latitude/cubemap_dynamic.html

But then I was wondering how this would look if it was refracting instead. At that point I realised that renderTarget was not a Texture, so it didn't have a mapping. So I tried this:

var material = new THREE.MeshBasicMaterial( { envMap: new THREE.Texture( cubeCamera.renderTarget, new THREE.CubeRefractionMapping() ) } );

http://dl.dropbox.com/u/7508542/three.js/latitude/cubemap_dynamic_refraction.html

So it seems like there is no way to use a render target for refraction? I guess WebGLRenderer accepts both Texture and WebGLRenderTarget in the map and envMap parameters... I wonder what's best: adding mapping to WebGLRenderTarget, or only accepting WebGLRenderTarget as Texture.image?

alteredq commented 12 years ago

Adding a mapping to WebGLRenderTarget is much simpler.

In fact it should work even now if you just stick the mapping on your cube target object:

cubeCamera.renderTarget.mapping = new THREE.CubeRefractionMapping();

Edit: tested and it does indeed work.
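
Put together, the workaround is just (a sketch reusing the cubeCamera from the demos above):

// switch the cube render target from reflection to refraction before using it as an envMap
cubeCamera.renderTarget.mapping = new THREE.CubeRefractionMapping();
cubeCamera.updateCubeMap( renderer, scene );

var material = new THREE.MeshBasicMaterial( { envMap: cubeCamera.renderTarget } );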

mrdoob commented 12 years ago

Yup. That did the trick. Thanks :) The visual result is not too impressive though...

spite commented 12 years ago

Cool!

I'll hijack the thread back to the original shader and the annoying seam that appears when reading directly from the panorama texture. It looks like the dFdx/dFdy functions can be used to determine the bias for the texture lookup. Yet, when trying to use those functions, the WebGL context reports that GL_OES_standard_derivatives is available, but the shader fails to compile. The water demo http://madebyevan.com/webgl-water/ works correctly and uses these functions.

Here's a basic use: http://jsfiddle.net/VJca4/. Check the console for "WARNING: 0:26: extension 'GL_OES_standard_derivatives' is not supported". Can anybody see if there's something wrong?

alteredq commented 12 years ago

@spite This looks like some problem with jsFiddle. When I replicated that code locally I managed to get it working without the warning.

You just need to fix two things:

spite commented 12 years ago

Oh! Now I get it!

I thought getExtension was only there to check whether the extension was available, and that #extension GL_OES_standard_derivatives : enable was what activates it for the shaders.

It turns out that getExtension actually activates the extension on the given context, and the line in the shader is probably just a hint for the preprocessor (I suppose).
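
In other words, roughly (a minimal sketch against a raw WebGL context; how the context is obtained is just an example):

var gl = renderer.getContext();
gl.getExtension( 'OES_standard_derivatives' ); // this is what actually turns the extension on

// the fragment shader still needs the preprocessor directive before using dFdx / dFdy / fwidth:
// #extension GL_OES_standard_derivatives : enable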

Cool. Thanks. Now the shader is working, but only gets two and a half quarters of the environment correct.

makc commented 8 years ago

@spite I have been playing with this idea

It looks like the dFdx/dFdy functions can be used to determine the bias for the texture lookup.

basically, doing this:

        gl_FragColor = texture2D (map, vec2 (
            0.5 + phi / 6.2831852,
            theta / 3.1415926
        ), 42.0 * dFdx (phi));

without bias, I am getting this: [screenshot: no bias]

with bias: [screenshot: with bias]

Better, but then it's blurry and it does not remove the seam completely. Now, this thread makes it look like you had the code somewhere; is there a link to steal it from 😉?

spite commented 8 years ago

Besides the first link in this thread... there's the Floating Shiny Knot from ages ago: https://www.clicktorelease.com/code/streetViewReflectionMapping/#39.36382677360614,8.431220278759724

I did a combination of partial derivatives and texture lookup biasing for my PBR/IBL experiments, but I would have to dive into my disk drive.

What code are you looking for exactly?

makc commented 8 years ago

bias expression

makc commented 8 years ago

Actually, I think I don't need the dFdx stuff here; -1234.0 * max (0.0, abs (phi) - 3.12) as the bias almost removes the seam, with no artifacts anywhere else: [screenshot]

With a little bit more fiddling it will work well for the poles too.
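
Plugged into the lookup from before, that would be something like the following sketch; theta is assumed to be the polar angle acos( vReflect.y ), and the magic numbers are the ones from this comment, which would need tuning per map:

float phi = atan( vReflect.z, - vReflect.x );
float theta = acos( vReflect.y );

// near the seam at phi ~ +/-PI the U coordinate wraps, so its screen-space derivative
// explodes and the hardware samples a tiny mip level; a large negative bias there
// forces the lookup back towards the base level and hides the seam
float seamBias = -1234.0 * max( 0.0, abs( phi ) - 3.12 );

gl_FragColor = texture2D( map, vec2( 0.5 + phi / 6.2831852, theta / 3.1415926 ), seamBias );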

makc commented 8 years ago

And, as the bias is getting better, it is also becoming obvious that not just the seam but the whole polar area needs love, since there are details that get needlessly blurred because of the larger sampling distance there.

[screenshot: better poles]

I can't believe no one has the expression for this; it's not like equirectangular maps were invented yesterday.

edit: yet Google turned up nothing, except this one guy who tries to do this but then just gives up.