oculus-samples / Unity-DepthAPI

Examples of using Depth API for real-time, dynamic occlusions
Other
167 stars 24 forks

Using Shader Graph #3

Closed: qlee01 closed this issue 6 months ago

qlee01 commented 8 months ago

Hey, is there a way to use shader graph for this? Is it possible to wire the given vertex / fragment pieces to the corresponding stages in shader graph?

vasylbo commented 8 months ago

It should be possible, but we didn't try yet. It's already in our backlog to add shadergraph support in the urp package.

qlee01 commented 8 months ago

thanks! I will try a bit; I have a rough idea how to do it. I just wanted to ask because this could have saved a few hours of work, especially since testing currently seems possible only on a build/device.

velickolb commented 8 months ago

Hey, I gave it a try but didn't get far with it. Could you give me some directions on how I would do it in Shader Graph?

qlee01 commented 8 months ago

I also did not succeed yet. I still have some mix-up of position data; it's not clear where to use object, world, or screen space. I can see some effect, but it seems to be mirrored and not stable in space. In principle you need two custom functions, I guess: one connected to the position node in the vertex stage, and one to alpha or alpha threshold in the fragment stage, using alpha clipping. You can use a custom interpolator to send data from the vertex to the fragment stage.
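The two-function setup described above could be sketched like this (function and parameter names are hypothetical; the actual occlusion lookup would come from the Depth API package):

```hlsl
// Vertex stage: forward the world-space position through a
// custom interpolator so the fragment stage can use it.
void GetWorldPos_float(float3 positionWS, out float3 interpPositionWS)
{
    interpPositionWS = positionWS;
}

// Fragment stage: turn the interpolated position into an alpha value
// and feed it into Alpha, with alpha clipping enabled on the graph.
void GetOcclusionAlpha_float(float3 interpPositionWS, out float alpha)
{
    // Placeholder: replace with the Depth API occlusion lookup.
    // 1 = fully visible, 0 = fully occluded (clipped away).
    alpha = 1.0;
}
```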

BigStoagie commented 8 months ago

So I've been messing around with trying to get this working in Shader Graph, and I seem to be having trouble utilizing the function that calculates the occlusion value. If I use a custom function node to call the function Meta provides, it gives me an error about a redefinition of `_Time` at line 40. I was just wondering if you could provide any guidance, if you've gotten that function to work.

vasylbo commented 8 months ago

We haven't started investigating it yet. The redefinition of `_Time` sounds like a conflict between the includes in the Shader Graph files and the code in the custom node. It's possible that to implement this we'll need to refactor the hlsl files we provide. I'll update here when we have more info on this.
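One way such include clashes are commonly avoided is to guard the include so it is skipped by Shader Graph's preview compiler, which already defines symbols like `_Time`; a minimal sketch, assuming the include path from the URP package mentioned in this thread:

```hlsl
// Hypothetical guard for a Custom Function node's include:
// skip it in Shader Graph preview, where _Time already exists.
#ifndef SHADERGRAPH_PREVIEW
#include "Packages/com.meta.xr.depthapi.urp/Shaders/EnvironmentOcclusionURP.hlsl"
#endif
```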

Ricimon commented 8 months ago

I managed to somewhat get a shader graph implementation working, but for some reason the occlusion doesn't work if the occluding object is too far from the camera.

This is the Occlusion subgraph that outputs an occlusionValue. It's got 2 custom functions, one to get positionCS and positionNDC, and one to get the occlusionValue.

Occlusion.hlsl

#ifndef OCCLUSION_INCLUDED
#define OCCLUSION_INCLUDED

#ifndef SHADERGRAPH_PREVIEW
    #include "Packages/com.meta.xr.depthapi.urp/Shaders/EnvironmentOcclusionURP.hlsl"
#endif

#pragma multi_compile _ HARD_OCCLUSION SOFT_OCCLUSION

void CalculateEnvironmentDepthOcclusion_float (float2 uv, float sceneDepth, out float occlusionValue){
    #ifndef SHADERGRAPH_PREVIEW
        occlusionValue = CalculateEnvironmentDepthOcclusion(uv, sceneDepth);
    #else
        occlusionValue = 1;
    #endif
}

void CalculateEnvironmentDepthOcclusion_half (half2 uv, half sceneDepth, out half occlusionValue){
    #ifndef SHADERGRAPH_PREVIEW
        occlusionValue = CalculateEnvironmentDepthOcclusion(uv, sceneDepth);
    #else
        occlusionValue = 1;
    #endif
}

#endif // OCCLUSION_INCLUDED

The subgraph is then added to a shader graph like so

image

This works when the occluding object is close to the camera, but for some reason not when it's a little farther away. The white box on the left uses the occlusion code included with URP Unlit, while the gray box on the right uses the shader graph.

Any ideas on what's wrong here?

velickolb commented 8 months ago

Any ideas on what's wrong here?

I tried playing with the values in EnvironmentDepthTextureProvider.cs. Setting custom near/far depth values seems to have an effect, but not the expected one.

NicolasAbo17 commented 8 months ago

I tried this and the object just disappears for me; I am multiplying the occlusion value into my original ambient color. It would be good to have feedback from Meta on this shader graph, just to see how to get it working.

image

TudorJude commented 7 months ago

Hello everyone,

Sorry for the late reply on this issue. There will be a subgraph that handles occlusions, with examples, in the next update. For those of you who can't wait until then, here's how you can implement it yourselves:

1. Create an occlusion subgraph like so:

image

The CalculateEnvironmentDepthOcclusion node is a custom function node that has this hlsl code in it:

#pragma multi_compile _ HARD_OCCLUSION SOFT_OCCLUSION

#ifndef SHADERGRAPH_PREVIEW
#include "Packages/com.meta.xr.depthapi/Runtime/URP/EnvironmentOcclusionURP.hlsl"
#endif

void CalculateEnvironmentDepthOcclusion_float(float3 posWorld, out float occlusionValue)
{
#ifndef SHADERGRAPH_PREVIEW
    occlusionValue = META_DEPTH_GET_OCCLUSION_VALUE_WORLDPOS(posWorld, 0.0);
#else
    occlusionValue = 1;
#endif
}

void CalculateEnvironmentDepthOcclusion_half(float3 posWorld, out half occlusionValue)
{
#ifndef SHADERGRAPH_PREVIEW
    occlusionValue = META_DEPTH_GET_OCCLUSION_VALUE_WORLDPOS(posWorld, 0.0);
#else
    occlusionValue = 1;
#endif
}

Graph inspector looks like this for it: image

2. Add it to your Shadergraph. Here's an example:

image

In this instance I enable alpha clipping and set its threshold to 0; pixels with 0 alpha will be occluded.

Spphire commented 6 months ago

Can it now be tested over Oculus Link? And what's the meaning of the second parameter (0.0) of this function:

occlusionValue = META_DEPTH_GET_OCCLUSION_VALUE_WORLDPOS(posWorld, 0.0);
TudorJude commented 6 months ago

Hey Spphire,

  • We will release an update very soon that will have Link support.
  • The second parameter is the environment depth bias; you can expose that in your subgraph as well if you wish. Environment depth bias is explained in the readme, and there's a sample scene called "SceneAPIPlacement" that showcases its usage.

Spphire commented 6 months ago

Thanks for the reply!
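Exposing the depth bias in the subgraph, as suggested above, amounts to adding one input to the same custom function; a minimal sketch based on the function posted earlier in this thread (the extra parameter name is hypothetical):

```hlsl
// Variant of the custom function with the environment depth bias
// exposed as a Shader Graph input instead of a hard-coded 0.0.
void CalculateEnvironmentDepthOcclusion_float(float3 posWorld, float depthBias, out float occlusionValue)
{
#ifndef SHADERGRAPH_PREVIEW
    occlusionValue = META_DEPTH_GET_OCCLUSION_VALUE_WORLDPOS(posWorld, depthBias);
#else
    occlusionValue = 1;
#endif
}
```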

I tried it today and wondered whether Quest 2 can use it. I tried it both over Oculus Link and by building and installing on a Quest 2:

Oculus Link: the cube with the shader graph looks normal in the Unity editor, but once it starts running, the cube becomes totally transparent.

Installed on Quest 2: the cube is totally transparent when running.

So maybe Quest 2 can't use it?

TudorJude commented 6 months ago

Depth API is only supported on Quest 3.

TudorJude commented 6 months ago

Closing issue, the solution is now up on this repo.