keijiro / Pcx

Point cloud importer & renderer for Unity

color encoding not working correctly on kinect rgb images #88

Open AR-Vsx opened 1 year ago

AR-Vsx commented 1 year ago

Hi, I'm trying to render point clouds from depth and RGB images captured by the Azure Kinect DK. When using the compute buffer path of the PointCloudRenderer, the positions are rendered correctly, but the color encoding isn't working correctly.

This is the compute shader I'm using:

#pragma kernel BakeVertexTex

uint2 DepthRes;     // depth image resolution in pixels
float2 DepthScale;  // per-axis scale/flip factor for x and y

uint MinDepth;      // valid depth range in millimeters
uint MaxDepth;

StructuredBuffer<float> SpaceTable;  // camera-space table, 3 floats per depth pixel
StructuredBuffer<uint> DepthMap;     // 16-bit depth values, two packed per uint

RWStructuredBuffer<float4> PointCloudVertexBuffer;  // output: xyz position + packed color
RWTexture2D<float4> ColorRenderTexture;             // color image (ARGB32)

[numthreads(8, 8, 1)]
void BakeVertexTex(uint3 id : SV_DispatchThreadID)
{
    // Point position calculation
    uint i = id.x + id.y * DepthRes.x;

    //bool isOdd = i % 2 == 1;
    //uint iDepth2 = DepthMap[i >> 1];
    //uint iDepth = (isOdd ? iDepth2 <<= 16 : iDepth2) >> 16;

    // Two 16-bit depth samples are packed per uint: even pixels in the low
    // half, odd pixels in the high half.
    uint depth2 = DepthMap[i >> 1];
    uint depth = (i % 2 == 0 ? depth2 << 16 : depth2) >> 16;
    // Discard samples outside the valid range
    depth = depth >= MinDepth && depth <= MaxDepth ? depth : 0;

    float fDepth = (float)depth / 1000;
    bool mask = depth != 0;

    float3 pos = float3(
        SpaceTable[i * 3] * fDepth * DepthScale.x,
        SpaceTable[i * 3 + 1] * fDepth * DepthScale.y,
        mask ? fDepth : 1000
        );

    // Color encoding (taken from Common.cginc): the brightness (max channel)
    // goes into the top byte, RGB rescaled by that brightness into the lower
    // three bytes.
    half3 rgb = asuint(ColorRenderTexture[id.xy]);

    half y = max(max(rgb.r, rgb.g), rgb.b);
    y = clamp(ceil(y * 255 / 16), 1, 255);
    rgb *= 255 * 255 / (y * 16);
    uint4 n = half4(rgb, y);  // (n is currently unused; m packs the channels directly)
    uint m = ((uint)rgb.r) | ((uint)rgb.g << 8) | ((uint)rgb.b << 16) | ((uint)y << 24);

    // Write position and packed color into the output buffer
    PointCloudVertexBuffer[i] = float4(pos, m);
}

Shader dispatch:

        // Upload the raw depth image (two 16-bit samples packed per uint)
        SetComputeBufferData(pointCloudDepthBuffer, depthImage, depthImage.Length >> 1, sizeof(uint));
        SetComputeShaderInt2(pointCloudVertexShader, "DepthRes", width, height);
        SetComputeShaderFloat2(pointCloudVertexShader, "DepthScale", -1, -1);

        // Valid depth range in millimeters (0.5 m to 10 m)
        pointCloudVertexShader.SetInt("MinDepth", (int)(0.5f * 1000f));
        pointCloudVertexShader.SetInt("MaxDepth", (int)(10 * 1000f));
        pointCloudVertexShader.SetBuffer(pointCloudVertexKernel, "SpaceTable", pointCloudSpaceBuffer);
        pointCloudVertexShader.SetBuffer(pointCloudVertexKernel, "DepthMap", pointCloudDepthBuffer);
        pointCloudVertexShader.SetBuffer(pointCloudVertexKernel, "PointCloudVertexBuffer", pointCloudVertexBuffer);
        pointCloudVertexShader.SetBuffer(pointCloudVertexKernel, "PointCloudColorBuffer", pointCloudColorBuffer);
        pointCloudVertexShader.SetTexture(pointCloudVertexKernel, "ColorRenderTexture", colorRenderTexture);

        // One thread per depth pixel, 8x8 threads per group
        pointCloudVertexShader.Dispatch(pointCloudVertexKernel, width / 8, height / 8, 1);

        PointCloudRenderer.Instance.sourceBuffer = pointCloudVertexBuffer;
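
For completeness, the vertex buffer is allocated with one float4 per depth pixel, since the shader writes the position into xyz and the packed color into w (the actual allocation code isn't included above, so the count and stride shown here are roughly what I use):

        // One float4 (16 bytes) per depth pixel: xyz = position, w = packed color bits
        pointCloudVertexBuffer = new ComputeBuffer(width * height, sizeof(float) * 4);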

I took the color encoding part from the common.cginc of this project.
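
For reference, the encode/decode pair in Common.cginc looks roughly like this (quoted from memory, so it may differ slightly from the current version of the file):

    // Encode: the max channel (brightness) is quantized into the top byte,
    // RGB is rescaled by that brightness and stored in the lower three bytes.
    uint PcxEncodeColor(half3 rgb)
    {
        half y = max(max(rgb.r, rgb.g), rgb.b);
        y = clamp(ceil(y * 255 / 16), 1, 255);
        rgb *= 255 * 255 / (y * 16);
        uint4 i = half4(rgb, y);
        return i.x | (i.y << 8) | (i.z << 16) | (i.w << 24);
    }

    // Decode: unpack the four bytes and rescale RGB by the stored brightness.
    half3 PcxDecodeColor(uint data)
    {
        half r = (data      ) & 0xff;
        half g = (data >>  8) & 0xff;
        half b = (data >> 16) & 0xff;
        half a = (data >> 24) & 0xff;
        return half3(r, g, b) * a * 16 / (255 * 255);
    }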

This is the result when using the PointCloudRenderer with the PointCloudVertexBuffer: [screenshot: pointcloud]

I've confirmed that the ColorRenderTexture is displayed correctly and that its render texture format is ARGB32. I've tried switching the channels around, but no luck so far.
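
The channel-order variations I tried were along these lines; none of them gave correct colors:

    // e.g. reading the texel with red and blue swapped
    half3 rgb = asuint(ColorRenderTexture[id.xy]).bgr;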