ogoguel / realtimehand

Realtime Hand Tracking Unity Package

Request for Guidance on Simulating Reflections in Unity AR Hand-Tracking Project #7

Open ShonubiSamuel opened 2 months ago

ShonubiSamuel commented 2 months ago

Good afternoon,

Thank you once again for your previous response; I truly appreciate your help. I successfully implemented a hand-tracking system for two hands, though I used a different solution. However, your repository was incredibly valuable, especially for calculating depth using GetHumanDistanceFromEnvironment.

My next challenge is simulating reflections on the hand. Simply casting a light near the hand doesn't enhance the experience. I followed your breakdown on estimating normals and managed to estimate the normal of my hand, but I realized that my approach was essentially post-processing and that the estimated normal isn't usable.

Could you provide guidance or a reference on how to simulate reflections on the hand effectively? Here is the normal I estimated:

![Uploading IMG_1828.png…]()

Thank you for your time and assistance.

ogoguel commented 1 month ago

Not sure what you want to achieve, but since the hand is part of the camera feed, any effect has to be applied as a post-process.

As an example, here's part of the fragment shader used in the full app:

  1. it checks whether the pixel being post-processed is part of the hand or not
  2. it calculates the normal for each pixel ... based on the distance extracted from a preprocessed texture.
    
    inline float3 GetWorldPosFromUVAndDistance(float2 uv, float distance)
    {
        // screen UV -> normalized device coordinates
        float4 ndc = float4(2.0 * uv - 1.0, 1, 1);
        // unproject to get the view-space ray through this pixel
        float4 viewDir = mul(unity_CameraInvProjection, ndc);
    #if UNITY_REVERSED_Z
        viewDir.z = -viewDir.z;
    #endif
        // scale the ray by the per-pixel distance, then move to world space
        float3 viewPos = viewDir.xyz * distance;
        float3 worldPos = mul(unity_CameraToWorld, float4(viewPos, 1)).xyz;
        return worldPos;
    }

    float4 wp = ARKIT_SAMPLE_TEXTURE2D(_WorldPos, sampler_WorldPos, i.texcoord);
    float worldDistance = wp.a;   // the preprocessed texture stores the distance in its alpha channel
    float3 worldPos = GetWorldPosFromUVAndDistance(screenUV, worldDistance);
    // ddx/ddy are screen-space derivatives, so their cross product is a per-pixel face normal
    float3 N = normalize(cross(ddx(worldPos.xyz), ddy(worldPos.xyz)));

Normally, using ddx/ddy directly on the _EnvironmentDepth texture should have been sufficient, but I did not manage to make it work. Hence the interim texture, though there might be a better solution.
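
For reference, a minimal sketch of what that direct variant might look like, reusing the helper above. This is untested and assumes _EnvironmentDepth stores a linear per-pixel distance with a matching sampler_EnvironmentDepth sampler:

    // Hypothetical direct variant: derive the normal straight from the
    // environment depth texture, skipping the interim _WorldPos texture
    float envDistance = ARKIT_SAMPLE_TEXTURE2D(_EnvironmentDepth, sampler_EnvironmentDepth, i.texcoord).r;
    float3 envWorldPos = GetWorldPosFromUVAndDistance(screenUV, envDistance);
    float3 envN = normalize(cross(ddx(envWorldPos), ddy(envWorldPos)));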

  3. from there, it does some fancy calculations ;)
  4. returns a color to be added for the post-process

Hope it helps

Shader "HandShader" { Properties {

    _HumanStencil ("HumanStencil", 2D) = "black" {}
    _HumanDepth ("HumanDepth", 2D) = "black" {}
    _EnvironmentDepth ("EnvironmentDepth", 2D) = "black" {}
    _WorldPos ("WorldPos", 2D) = "black" {}
    _FingerA ("FingerA", Vector) = (0,1,0,0)
    _FingerA ("FingerB", Vector) = (0,1,0,0)
    _ScreenA ("ScreenA", Vector) = (0,1,0,0)
    _ScreenB ("ScreenB", Vector) = (0,1,0,0)

    _Intensity("Intensity", Float) = 1000
    _SpotLight ("SpotLight", Vector) = (0,1,0,0)
    _Distortion("Distortion", Float) = 1
    _Power("Power", Float) = 0.5
    _Scale("Scale", Float) = 0.5
    _Color("Color", Color) = (0,0,1,1)
}
SubShader
{

    Tags
    {
        "Queue" = "Background"
        "RenderType" = "Background"
        "ForceNoShadowCasting" = "True"
    }

    Pass
    {
        Cull Off
        ZWrite Off
        Lighting Off
        LOD 100
        Blend One One

        HLSLPROGRAM

        #pragma vertex vert
        #pragma fragment frag

        #pragma multi_compile_local __ HAND_3DLINE HAND_2DLINE HAND_SUBSURFACE
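
        // NB: this excerpt omits the #include directives, the texture/sampler
        // declarations (_HumanStencil, _WorldPos, ...) and the
        // GetWorldPosFromUVAndDistance helper shown earlier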

        struct appdata
        {
            float3 vertex : POSITION;
            float2 texcoord : TEXCOORD0;
        };

        struct v2f
        {
            float4 vertex : SV_POSITION;
            float3 ray  : TEXCOORD1;
            float2 texcoord : TEXCOORD0;
            float2 uv : TEXCOORD2;
            float4 screenPosition : TEXCOORD5;
        };

        float4 _Color;
        float _Intensity;
        float4 _FingerA;
        float4 _FingerB;
        float4 _ScreenA;
        float4 _ScreenB;
        float _Distortion;
        float _Power;
        float _Scale;
        float4 _SpotLight;

        struct fragment_output
        {
            real4 color : SV_Target;
        };

        v2f vert (appdata v)
        {
            v2f o;
            o.vertex  = TransformObjectToHClip(v.vertex);
            float2 texcoord = mul(float3(v.texcoord, 1.0f), _UnityDisplayTransform).xy;

            o.texcoord = texcoord; // remapped
            o.uv = v.texcoord;  // original

            o.screenPosition = ComputeScreenPos(o.vertex);

            return o;
        }

        fragment_output frag (v2f i)
        {
            fragment_output o;
            real4 color = real4(0, 0, 0, 0);

            float2 screenUV = i.screenPosition.xy / i.screenPosition.w;
            // stencil: >0.5 where the pixel belongs to the hand
            // (how the mask gates the effect is trimmed from this excerpt)
            bool isHuman = ARKIT_SAMPLE_TEXTURE2D(_HumanStencil, sampler_HumanStencil, i.texcoord).r > 0.5h;

            float4  wp = ARKIT_SAMPLE_TEXTURE2D(_WorldPos,sampler_WorldPos,i.texcoord);
            float   worldDistance = wp.a;
            float3  worldPos = GetWorldPosFromUVAndDistance(screenUV,worldDistance);
            float3  N = normalize(cross(ddx(worldPos.xyz), ddy(worldPos.xyz)));

            float3 viewDir = normalize(_WorldSpaceCameraPos.xyz - worldPos.xyz);

            float3 L = normalize(_SpotLight.xyz - worldPos);
            float3 V = viewDir;

            // view-dependent highlight: bend the light vector by the surface
            // normal, then measure how directly the bent vector faces the viewer
            float3 H = normalize(L + N * _Distortion);
            float d = dot(V, -H);
            float I = pow(saturate(d), _Power) * _Scale;

            // inverse-square falloff from the spotlight
            float lightDistance = distance(worldPos, _SpotLight.xyz);
            float distanceSqr = max(lightDistance * lightDistance, 0.00001);
            float attenuation = 1.0 / distanceSqr;

            color += I * _Color * attenuation / _Intensity;

            o.color = color;
            return o;
        }

        ENDHLSL
    }
}

}
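
For readers following the math: the highlight term in frag resembles the classic fast-translucency approximation, where the light vector is distorted by the surface normal and compared against the view direction. With δ = _Distortion, p = _Power, s = _Scale, k = _Intensity, C = _Color and d the distance to the light, the color added per pixel is

$$\Delta C = \frac{s \cdot \operatorname{saturate}\big(V \cdot -\operatorname{normalize}(L + \delta N)\big)^{p}}{k \cdot \max(d^{2},\ 10^{-5})} \cdot C$$

where V is the view direction, L the direction to the spotlight, and N the reconstructed normal.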

ShonubiSamuel commented 1 month ago

Wow 🤩, thank you so much for your detailed response and for sharing the fragment shader example! I really appreciate your time and effort in helping me understand the post-processing of hand effects. Your explanation about calculating normals and using the preprocessed texture makes a lot of sense. Now I have to study and experiment with the approach you mentioned and see how it fits into my project.

Thanks again for the insight, it's super helpful! 😊 Oh, and it's this effect of yours I'm trying to achieve 😅 IMG_1231

ShonubiSamuel commented 1 month ago

Hi Sir,

I hope you've been doing well with minimal stress. I just wanted to take a moment to appreciate your help and give you an update on the project. I've successfully simulated the reflections on the hand, and below is a recording. I also posted it on my LinkedIn if you'd like to check it out.

I understand there’s still room for improvement, and I plan to continue working on those areas.

https://github.com/user-attachments/assets/203e9842-d261-41ac-bce8-36099efbb188

ogoguel commented 2 weeks ago

Top! Keep up the good work

ShonubiSamuel commented 2 weeks ago

🤩 Thank you so much, sir! 🙇 I’ve been meaning to express my appreciation, but school exams kept me busy and away from Unity for a while. Coincidentally, I’ll be done with exams today, and I can’t wait to get back to the project and make improvements.
