wave-harmonic / crest

A class-leading water system implemented in Unity
MIT License

Ocean Meshes doesn't get detected from Post Processing Depth Channel #123

Closed gameanimation3d closed 5 years ago

gameanimation3d commented 6 years ago

Hey,

it seems that because the ocean shader uses transparent rendering, the camera doesn't detect the meshes within the depth channel. So you can't really use Depth of Field on the camera. Is there a workaround, or can you fix this?

huwb commented 6 years ago

I've been waiting for this issue :) Yeah, it has not been tested with DoF as far as I'm aware, and I can imagine that it does not work. The ocean does not write to the post-processing depth, as you have found. We had similar problems with post-process fog. This is a fairly fundamental issue with the built-in pipeline. For fog, the solution is to do the fog calculations while rendering the water surface (forward), instead of relying on deferred. It's not obvious how to do something like this with DoF!

Also, it's not just about blurring the water surface - there are surfaces below the water that are visible through the surface, which should get blurred with a separate depth. I think the most general and robust solution would be to render and blur the background surfaces, render and blur the foreground surfaces, and then blend the two together. I don't know if this really works in practice. And it would probably be easier to do this in LWRP/HDRP, which are not supported yet by Crest, but I'm working towards it.

To actually answer your question - this is way out of my reach, at the current time at least, so it should either be treated as a limitation, or, if you want to look at adding this, I'd be happy to chip in with suggestions and would obviously love to evaluate any solution we come up with, to see if it is robust and general enough to absorb back into Crest, if that would be an option.

Let me know what you think and then I'll decide what to do with this issue.

huwb commented 6 years ago

One additional thought - if you don't need refraction or underwater, it should be possible to render the ocean as an opaque material, which would then render into depth. This means, though, that the ocean would cut through objects in a harsh way and you wouldn't see anything below the surface.
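For anyone trying this, the switch is mostly a matter of the queue tag and blend/depth state. A minimal, hypothetical ShaderLab fragment (not Crest's actual shader, which carries much more state than this) illustrating the lines that would change:

```shaderlab
// Hypothetical fragment - switching a water SubShader from
// transparent to opaque so it writes into the depth buffer.
SubShader
{
    // Transparent version was something like:
    //   Tags { "Queue"="Transparent" "RenderType"="Transparent" }
    Tags { "Queue"="Geometry" "RenderType"="Opaque" }

    Pass
    {
        // Transparent version: Blend SrcAlpha OneMinusSrcAlpha, ZWrite Off
        Blend Off
        ZWrite On
        // ... the CGPROGRAM block would be unchanged ...
    }
}
```

With ZWrite On and an opaque queue the surface ends up in the camera depth texture, at the cost of losing refraction and anything visible below the surface, as noted above.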

gameanimation3d commented 6 years ago

Hey,

thank you for the quick answer. Over the last day I looked into some possible solutions. First of all, for my purposes, blurring the water surface is okay, but I still want to keep the transparency and refraction. By the way, we are using deferred rendering because we use many lights.

A possible solution would be to use a replacement shader for the ocean shader: swap the rendering mode to opaque, save the depth buffer of the camera frame, then swap the rendering mode back to transparent and insert the captured depth buffer. Some examples of how to do this: https://forum.unity.com/threads/reuse-depth-buffer-of-main-camera.280460/ https://www.reddit.com/r/Unity3D/comments/7zwn0z/replace_the_depth_buffer_for_vr/

Unfortunately I am not familiar with shader code :( So I hope it's easy for you to get this done. Otherwise I'll have to try it myself, or do it globally by replacing the camera's depth buffer. It's maybe not the optimal solution, because the underwater part gets ignored, but a lot of post effects rely on depth, like motion blur and AO.

A maybe more complex solution would be the one from the Unity ADAM team, with a transparent helmet shield: https://blogs.unity3d.com/2018/01/24/rendering-and-shading-in-adam-episode-3/ in the "Marian's visor" section.

Nevertheless, the final and best solution in the end would be to get Crest into HDRP/LWRP.

huwb commented 6 years ago

Ah, the visor thing is what I was guessing might be possible above, and it was indeed developed in HDRP according to the article (then ported to the built-in pipeline).

I think that is the way to do it. If they release the source to Adam ep 3 that could be a good way to get started.

gameanimation3d commented 6 years ago

There are already assets for the corresponding ADAM short available on the Asset Store, but the visor asset wasn't uploaded. They mention the topic briefly in this talk, with some more detailed images of the settings: https://youtu.be/xKWQBSnhExM?t=1347 at 22:27, but they don't really go into detail.

gameanimation3d commented 6 years ago

Hey, I tried a workaround for this problem: a plane which only contributes to the depth buffer. But unfortunately the plane blocks, for example, the ocean shader's subsurface scattering. I already tried setting the plane shader's render queue to 1, so that the plane renders into the depth buffer first and everything else afterwards, but it doesn't help. Nevertheless, the workaround works for the post FX.

Do you know how to fix the blocking of the ocean shader?

with plane enabled: vlcsnap-2018-12-01-18h44m59s018

without depth plane: vlcsnap-2018-12-01-18h45m08s937

I am using the https://github.com/RamiroOliva/TranspDepth shaders for the plane: one shader contributes to the depth buffer only, the other makes the plane invisible.

This shader contributes only to the depth buffer:

Shader "TransDepth/JustDepth"
{
    Properties
    {
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
            };

            struct v2f
            {
                float4 vertex : SV_POSITION;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // The visible pass draws nothing (discard also skips the
                // depth write here); the camera depth texture is instead
                // filled via the ShadowCaster pass supplied by the
                // "Diffuse" fallback below.
                discard;
                return 0;
            }
            ENDCG
        }
    }
    Fallback "Diffuse"
}

And this one only handles the visual appearance - it makes the plane invisible (no color output) while still writing depth:

Shader "Custom/InvisibleMask" {
    SubShader{
        // draw after all opaque objects (queue = 2001):
        Tags { "Queue" = "Geometry+1" }
        Pass {
          ColorMask 0
          ZWrite On
        }
    }
}

I hope you know a solution.

huwb commented 5 years ago

Hey, interesting approach. I can understand that writing the depth will mess with the subsurface scattering. You are running up against the limits of the built-in render pipeline - it's hard to get what you want due to the way Unity stores off the depth for post-processing.

The best I can think of is to render your own depth for the scene, and give these depths to the ocean shader, which should allow it to compute scattering correctly. You could bind your custom rendered depth to the ocean shader using _oceanMaterial.SetTexture("_CameraDepthTexture", tex). You could render these depths using a second camera, or directly by setting up your own render texture and calling Camera.Render(), or using command buffers. You may also need to turn off depth testing on the ocean shader so it doesn't clip against your invisible plane.
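As a very rough sketch of the second-camera variant (a hypothetical component, untested with Crest; the replacement shader, the "DepthPlane" layer name, and wiring to the ocean material are all assumptions):

```csharp
using UnityEngine;

// Hypothetical: render scene depth with a hidden second camera and
// hand it to the ocean material in place of _CameraDepthTexture.
[RequireComponent(typeof(Camera))]
public class CustomOceanDepth : MonoBehaviour
{
    public Material oceanMaterial;   // the Crest ocean material
    public Shader depthReplacement;  // renders everything as opaque depth

    Camera _depthCam;
    RenderTexture _depthRT;

    void LateUpdate()
    {
        var main = GetComponent<Camera>();
        if (_depthCam == null)
        {
            _depthRT = new RenderTexture(main.pixelWidth, main.pixelHeight,
                24, RenderTextureFormat.Depth);
            var go = new GameObject("OceanDepthCam");
            go.hideFlags = HideFlags.HideAndDontSave;
            _depthCam = go.AddComponent<Camera>();
            _depthCam.enabled = false; // rendered manually below
        }

        _depthCam.CopyFrom(main);
        // Skip the invisible depth plane so it can't occlude anything here.
        _depthCam.cullingMask &= ~LayerMask.GetMask("DepthPlane");
        _depthCam.targetTexture = _depthRT;
        _depthCam.RenderWithShader(depthReplacement, "RenderType");

        oceanMaterial.SetTexture("_CameraDepthTexture", _depthRT);
    }
}
```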

I guess the other path is to render the ocean completely separately and then composite at the end. This is similar to the visor case above. It might be easier to render using a second camera, using layers to separate the views, or maybe doing it with command buffers / custom render code is easier. I'm not sure; it's probably going to be a fair bit of work either way.

I'm sorry I can't do more here - this needs dedicated coding love from someone with much more time than myself. I can apply the help wanted label to this issue in case a coder feels like collaborating with you, if that helps? I've not seen this happen so far, but it might be worth a shot.

gameanimation3d commented 5 years ago

Hey,

I implemented a second camera which only renders the depth plane to a custom depth render texture. I tried to set the depth into the shader, but this didn't work for me because the depth would be replaced afterwards by the scene. Therefore my approach was to inject the custom depth buffer before the image effects happen. I already tried using CommandBuffer and SetRenderTarget / SetTargetBuffers, but I wasn't able to combine the color buffer from the main camera with the custom depth buffer.

I think the best way would be to use OnRenderImage(), which gets called before the image effects happen. By using the Blit function, combining the color buffer from the main camera with the custom depth buffer should be possible. But I am struggling with the shader for the blit function, which should only read the depth information and return it to the render texture, which can then be passed to the destination render texture used for the image effects. Or do you know a better solution for combining two render textures?
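For reference, the OnRenderImage() idea could be sketched like this (hypothetical; it assumes a blit material whose shader passes color through while writing SV_Depth from a custom depth texture - whether the post stack then actually picks this depth up depends on how it binds _CameraDepthTexture):

```csharp
using UnityEngine;

// Hypothetical: re-inject a custom depth buffer just before the
// image effects run on the main camera.
[RequireComponent(typeof(Camera))]
public class InjectCustomDepth : MonoBehaviour
{
    public Material combineMat;       // passes color, writes SV_Depth
    public RenderTexture customDepth; // filled by the second camera

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        combineMat.SetTexture("_CustomDepthTex", customDepth);
        // Blit copies src's color to dst; the material's fragment
        // shader additionally writes the custom depth.
        Graphics.Blit(src, dst, combineMat);
    }
}
```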

huwb commented 5 years ago

Hi there, thanks for the info.

Yeah, it sounds tricky; I always struggle to keep all of these things in my head at one time.

Is Unity's post-process depth bound on the GPU when OnRenderImage() is called? I would imagine it is. Perhaps you can then update this depth by doing a blit with a shader like this:

https://github.com/Unity-Technologies/ScriptableRenderPipeline/blob/e24fef6e2fa7320dd6dd7f58c0a69e165b776d80/com.unity.render-pipelines.lightweight/Shaders/Utils/CopyDepth.shader

This shader writes depth only, and will sample whatever custom depth you give it. You could set the blend mode as desired - if you want to combine the two depths you could use a Max blend mode (as the depth buffer is inverted - 0 is the background, 1 is the foreground). This hopefully allows you to shove your info into the depth buffer.
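In the built-in pipeline (i.e. without the SRP includes of the linked shader), a depth-only blit shader along those lines might look like this. This is a sketch: `_CustomDepthTex` is an assumed input, and ZTest Always simply overwrites the existing depth - taking the max of two depths would instead need a different ZTest, or a float color target with Blend Max as suggested above.

```shaderlab
// Hypothetical depth-only blit shader for the built-in pipeline.
Shader "Hidden/WriteCustomDepth"
{
    Properties { _CustomDepthTex ("Custom Depth", 2D) = "black" {} }
    SubShader
    {
        Pass
        {
            ZTest Always
            ZWrite On
            ColorMask 0   // write depth only, leave color untouched

            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D_float _CustomDepthTex;

            float frag (v2f_img i) : SV_Depth
            {
                // Output the sampled depth as this fragment's depth.
                return SAMPLE_DEPTH_TEXTURE(_CustomDepthTex, i.uv);
            }
            ENDCG
        }
    }
}
```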

Another alternative might be to build the postprocessing from source, in which case you may be able to use whatever depth you like. https://github.com/Unity-Technologies/PostProcessing/wiki/Installation . Maybe you could add an optional field to the Depth Of Field post effect which takes a custom depth buffer.

gameanimation3d commented 5 years ago

Hey, thank you very much for the solution. But the shader only works with the new render pipeline, right? It requires several core .hlsl files from the SRP. Or do I only need the corresponding files from the SRP?

huwb commented 5 years ago

Yeah, you'll need to bring them in. I would probably just copy the bits of code you need directly into the shader, but you can also bring in the .hlsl files in the same folder structure and see if that works.

huwb commented 5 years ago

Closing due to inactivity. If you want it reopened let me know.

It's worth calling out that there is an example of manually managing Unity's depth buffer in #193; I don't know if it helps exactly here, but it might give some ideas.