Unity-Technologies / arfoundation-samples

Example content for Unity projects based on AR Foundation

ARCore 5.0.2 Occlusion Shader accessing _CameraDepthTexture #1012

Closed valantano closed 1 year ago

valantano commented 2 years ago

Hello Guys,

I am working on an AR Foundation project in which I need to create my own occlusion shader. Below is a modified copy of the Unlit/ARCoreBackground/AfterOpaques shader provided by the Google ARCore XR Plugin (v5.0.2). I did not touch the first pass, only the second. However, there is a problem in the first pass: textureCoordQuad does not work properly (if I use it, the texture turns pink). I actually don't know what the first pass is doing, or whether it is even working. In the second pass I am able to sample _CameraDepthTexture as the background texture (the texture which would normally contain the camera image), but since textureCoordQuad is not working I can only use the texture coordinates of the environment depth texture, which is why the depth texture is not displayed correctly.

Can someone please help me with this shader? How do I work around the textureCoordQuad problem to get the texture coordinates for _CameraDepthTexture? And can someone tell me what the first pass is supposed to do? Maybe someone has a good tutorial available? I know how vertex and fragment shaders work, but I am not deep into OpenGL ES 3 shaders. Sadly, Unity's default Unlit shaders don't seem to work on Android, even though they are supposed to be converted automatically and the AR Foundation documentation provides a pseudocode example: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/migration-guide-5-x.html

Thanks in advance.

Shader "Unlit/ARCoreBackground/AfterOpaques"
{
    Properties
    {
        _MainTex("Texture", 2D) = "white" {}
        _EnvironmentDepth("Texture", 2D) = "black" {}
    }

    SubShader
    {
        Name "ARCore Background (After Opaques)"
        Tags
        {
            "Queue" = "Background"
            "RenderType" = "Background"
            "ForceNoShadowCasting" = "True"
        }

        Pass
        {
            Name "ARCore Background Occlusion Handling"
            Cull Off
            ZTest GEqual
            ZWrite On
            Lighting Off
            LOD 100
            Tags
            {
                "LightMode" = "Always"
            }

            GLSLPROGRAM

            #pragma multi_compile_local __ ARCORE_ENVIRONMENT_DEPTH_ENABLED

            #pragma only_renderers gles3

            #include "UnityCG.glslinc"

#ifdef SHADER_API_GLES3
#extension GL_OES_EGL_image_external_essl3 : require
#endif // SHADER_API_GLES3

            // Device display transform is provided by the AR Foundation camera background renderer.
            uniform mat4 _UnityDisplayTransform;

#ifdef VERTEX
            varying vec2 textureCoord;

            void main()
            {
#if defined(SHADER_API_GLES3) && defined(ARCORE_ENVIRONMENT_DEPTH_ENABLED)
                // Transform the position from object space to clip space.
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

                // Get quad uvs for sampling camera texture
                textureCoordQuad = gl_MultiTexCoord0;

                // Remap the texture coordinates based on the device rotation.
                textureCoordEnvironment = (_UnityDisplayTransform * vec4(gl_MultiTexCoord0.x, 1.0f - gl_MultiTexCoord0.y, 1.0f, 0.0f)).xy;
#endif // SHADER_API_GLES3 && ARCORE_ENVIRONMENT_DEPTH_ENABLED
            }
#endif // VERTEX

#ifdef FRAGMENT
            varying vec2 textureCoordQuad;
            varying vec2 textureCoordEnvironment;
            uniform float _UnityCameraForwardScale;

#ifdef ARCORE_ENVIRONMENT_DEPTH_ENABLED
            uniform sampler2D _EnvironmentDepth;
            uniform sampler2D _CameraDepthTexture;
#endif // ARCORE_ENVIRONMENT_DEPTH_ENABLED

            float ConvertDistanceToDepth(float d)
            {
                d = _UnityCameraForwardScale > 0.0 ? _UnityCameraForwardScale * d : d;

                float zBufferParamsW = 1.0 / _ProjectionParams.y;
                float zBufferParamsY = _ProjectionParams.z * zBufferParamsW;
                float zBufferParamsX = 1.0 - zBufferParamsY;
                float zBufferParamsZ = zBufferParamsX * _ProjectionParams.w;

                // Clip any distances smaller than the near clip plane, and compute the depth value from the distance.
                return (d < _ProjectionParams.y) ? 1.0f : ((1.0 / zBufferParamsZ) * ((1.0 / d) - zBufferParamsW));
            }

            void main()
            {
#if defined(SHADER_API_GLES3) && defined(ARCORE_ENVIRONMENT_DEPTH_ENABLED)

                vec3 result = texture(_MainTex, textureCoord).xyz;

                float depth = texture(_CameraDepthTexture, textureCoordQuad).x;

                float distance = texture(_EnvironmentDepth, textureCoordEnvironment).x;
                float environmentDepth = ConvertDistanceToDepth(distance);

#ifndef UNITY_COLORSPACE_GAMMA
                result = GammaToLinearSpace(result);
#endif // !UNITY_COLORSPACE_GAMMA

                if (depth >= environmentDepth)
                {
                    discard;
                }

                gl_FragColor = vec4(result, 1.0);
                gl_FragDepth = depth;
#endif // SHADER_API_GLES3 && ARCORE_ENVIRONMENT_DEPTH_ENABLED
            }

#endif // FRAGMENT
            ENDGLSL
        }

        Pass
        {
            Name "AR Camera Background (ARCore)"
            Cull Off
            ZTest LEqual
            ZWrite On
            Lighting Off
            LOD 100
            Tags
            {
                "LightMode" = "Always"
            }

            GLSLPROGRAM

            #pragma multi_compile_local __ ARCORE_ENVIRONMENT_DEPTH_ENABLED

            #pragma only_renderers gles3

            #include "UnityCG.glslinc"

#ifdef SHADER_API_GLES3
#extension GL_OES_EGL_image_external_essl3 : require
#endif // SHADER_API_GLES3

            // Device display transform is provided by the AR Foundation camera background renderer.
            uniform mat4 _UnityDisplayTransform;

#ifdef VERTEX
            varying vec2 textureCoord;

            void main()
            {
#ifdef SHADER_API_GLES3
                // Transform the position from object space to clip space.
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                textureCoordQuad = gl_MultiTexCoord0;
                // Remap the texture coordinates based on the device rotation.
                textureCoord = (_UnityDisplayTransform * vec4(gl_MultiTexCoord0.x, 1.0f - gl_MultiTexCoord0.y, 1.0f, 0.0f)).xy;
#endif // SHADER_API_GLES3
            }
#endif // VERTEX

#ifdef FRAGMENT
            varying vec2 textureCoord;
            uniform samplerExternalOES _MainTex;
            uniform float _UnityCameraForwardScale;
            varying vec2 textureCoordQuad;

#ifdef ARCORE_ENVIRONMENT_DEPTH_ENABLED
            uniform sampler2D _EnvironmentDepth;
            uniform sampler2D _CameraDepthTexture;
#endif // ARCORE_ENVIRONMENT_DEPTH_ENABLED

#if defined(SHADER_API_GLES3) && !defined(UNITY_COLORSPACE_GAMMA)
            float GammaToLinearSpaceExact(float value)
            {
                if (value <= 0.04045F)
                    return value / 12.92F;
                else if (value < 1.0F)
                    return pow((value + 0.055F) / 1.055F, 2.4F);
                else
                    return pow(value, 2.2F);
            }

            vec3 GammaToLinearSpace(vec3 sRGB)
            {
                // Approximate version from http://chilliant.blogspot.com.au/2012/08/srgb-approximations-for-hlsl.html?m=1
                return sRGB * (sRGB * (sRGB * 0.305306011F + 0.682171111F) + 0.012522878F);

                // Precise version, useful for debugging, but the pow() function is too slow.
                // return vec3(GammaToLinearSpaceExact(sRGB.r), GammaToLinearSpaceExact(sRGB.g), GammaToLinearSpaceExact(sRGB.b));
            }

#endif // SHADER_API_GLES3 && !UNITY_COLORSPACE_GAMMA

            float ConvertDistanceToDepth(float d)
            {
                d = _UnityCameraForwardScale > 0.0 ? _UnityCameraForwardScale * d : d;

                float zBufferParamsW = 1.0 / _ProjectionParams.y;
                float zBufferParamsY = _ProjectionParams.z * zBufferParamsW;
                float zBufferParamsX = 1.0 - zBufferParamsY;
                float zBufferParamsZ = zBufferParamsX * _ProjectionParams.w;

                // Clip any distances smaller than the near clip plane, and compute the depth value from the distance.
                return (d < _ProjectionParams.y) ? 1.0f : ((1.0 / zBufferParamsZ) * ((1.0 / d) - zBufferParamsW));
            }

            void main()
            {
#ifdef SHADER_API_GLES3
                vec3 result = texture(_MainTex, textureCoord).xyz;
                float depth = 1.0;

#ifdef ARCORE_ENVIRONMENT_DEPTH_ENABLED
                float depthTT = texture(_CameraDepthTexture, textureCoord).x;
                float distance = texture(_EnvironmentDepth, textureCoord).x;
                depth = ConvertDistanceToDepth(distance);
#endif // ARCORE_ENVIRONMENT_DEPTH_ENABLED

#ifndef UNITY_COLORSPACE_GAMMA
                result = GammaToLinearSpace(result);
#endif // !UNITY_COLORSPACE_GAMMA

                gl_FragColor = vec4(result, 1.0);
                gl_FragDepth = depth; // depth = 0.0: only the background is displayed; depth = 1.0: virtual objects appear in front of the background
                gl_FragDepth = 0.0;

#ifdef ARCORE_ENVIRONMENT_DEPTH_ENABLED
                gl_FragColor = vec4(depthTT, 0.0, 0.0, 1.0);
                gl_FragDepth = 0.0;
#endif

#endif // SHADER_API_GLES3
            }

#endif // FRAGMENT
            ENDGLSL
        }
    }

    FallBack Off
}
DavidMohrhardt commented 2 years ago

Just to start by answering some questions here:

Something I noticed is that textureCoordQuad is not declared in the right scope for the vertex shader. You could try declaring it as a `varying vec2` above the main function in the VERTEX block.
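A minimal sketch of that fix for the first pass, assuming the intent is to pass both sets of UVs from vertex to fragment (the varying names match the fragment-stage declarations in the shader above; everything else follows the original vertex code):

```glsl
#ifdef VERTEX
            // Declare both varyings here so they are in scope in the vertex
            // stage and match the declarations in the FRAGMENT block.
            varying vec2 textureCoordQuad;
            varying vec2 textureCoordEnvironment;

            void main()
            {
#if defined(SHADER_API_GLES3) && defined(ARCORE_ENVIRONMENT_DEPTH_ENABLED)
                // Transform the position from object space to clip space.
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

                // Quad UVs for sampling the camera depth texture
                // (note the .xy swizzle: gl_MultiTexCoord0 is a vec4).
                textureCoordQuad = gl_MultiTexCoord0.xy;

                // Remap the environment texture coordinates based on the device rotation.
                textureCoordEnvironment = (_UnityDisplayTransform *
                    vec4(gl_MultiTexCoord0.x, 1.0 - gl_MultiTexCoord0.y, 1.0, 0.0)).xy;
#endif // SHADER_API_GLES3 && ARCORE_ENVIRONMENT_DEPTH_ENABLED
            }
#endif // VERTEX
```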

First, the first pass in the SubShader is used to minimize overdraw in the AfterOpaques rendering scenario when occlusion is also enabled. Since all scene content has already been rendered before background rendering, AR Foundation needs to determine where to render the background over the existing geometry. The easiest way to accomplish this is to compare the depth buffer against the environment depth value for each pixel: if the environment depth is in front of the geometry, that pixel's color is overwritten with the camera background.
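The comparison above works because ConvertDistanceToDepth maps the metric environment distance into the same nonlinear depth-buffer range as the scene geometry. Here is a hypothetical Python port of that shader function to illustrate the mapping (`near`/`far` stand in for `_ProjectionParams.y`/`.z`, and `forward_scale` for `_UnityCameraForwardScale`):

```python
def convert_distance_to_depth(d, near, far, forward_scale=1.0):
    """Port of the shader's ConvertDistanceToDepth for illustration.

    Maps a metric distance d (meters) to a nonlinear depth-buffer value:
    0.0 at the near plane, 1.0 at the far plane.
    """
    # Apply the camera forward scale when it is positive, as the shader does.
    d = forward_scale * d if forward_scale > 0.0 else d

    # Reconstruct the zBufferParams terms from near/far, mirroring the shader,
    # where _ProjectionParams = (_, near, far, 1/far).
    z_w = 1.0 / near
    z_y = far * z_w
    z_x = 1.0 - z_y
    z_z = z_x * (1.0 / far)

    # Distances closer than the near plane are clamped to the far depth value.
    if d < near:
        return 1.0
    return (1.0 / z_z) * ((1.0 / d) - z_w)
```

With a typical camera (near = 0.3, far = 100), the result grows monotonically from 0 at the near plane to 1 at the far plane, so it can be compared directly against the scene depth for the per-pixel occlusion test David describes.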

ankur-unity commented 1 year ago

Duplicate of #1017. A fix has been implemented for this issue. See https://github.com/Unity-Technologies/arfoundation-samples/issues/1017#issuecomment-1453782949 and https://github.com/Unity-Technologies/arfoundation-samples/issues/1017#issuecomment-1559356323.