shader-slang / slang

Making it easier to work with shaders
http://shader-slang.com
MIT License

Incorrect declaration of 64-bit static arrays #5458

Open · Dynamitos opened this issue 5 days ago

Dynamitos commented 5 days ago

I have found a very perplexing problem that occurs when declaring a static global array of 64-bit bitmask constants.

struct ComputeParams
{
    RWStructuredBuffer<uint64_t> bitFieldBuffer;
};
ParameterBlock<ComputeParams> pParams;

static const uint64_t OCBT_bit_mask[18] = { 0xffffffff, // Root 17
                            0xffffffff, // Level 16
                            0xffffffff, // level 15
                            0xffffffff, // level 14
                            0xffffffff, // level 13
                            0xffffffff, // level 12
                            0xffffffff, // level 11

                            0xffff, // level 10
                            0xffff, // level 9
                            0xffff, // level 8
                            0xff, // level 8

                            0xffffffffffffffff, // level 7
                            0xffffffff, // Level 6
                            0xffff, // level 5
                            0xff, // level 4
                            0xf, // level 3
                            0x3, // level 2
                            0x1, // level 1
};

[numthreads(1, 1, 1)]
[shader("compute")]
void TestHeap(uint dispatchID: SV_DispatchThreadID)
{
    pParams.bitFieldBuffer[dispatchID] = OCBT_bit_mask[dispatchID];
}

This produces the following GLSL code:

#version 450
#extension GL_EXT_shader_explicit_arithmetic_types_int64 : require
layout(row_major) uniform;
layout(row_major) buffer;

#line 34 0
layout(std430, binding = 0) buffer StructuredBuffer_uint64_t_0 {
    uint64_t _data[];
} pParams_bitFieldBuffer_0;

#line 8
const uint64_t  OCBT_bit_mask_0[18] = { 18446744073709551615UL, 18446744073709551615UL, 18446744073709551615UL, 18446744073709551615UL, 18446744073709551615UL, 18446744073709551615UL, 18446744073709551615UL, 65535UL, 65535UL, 65535UL, 255UL, 18446744073709551615UL, 18446744073709551615UL, 65535UL, 255UL, 15UL, 3UL, 1UL };

#line 32
layout(local_size_x = 1, local_size_y = 1, local_size_z = 1) in;
void main()
{

#line 32
    uint _S1 = gl_GlobalInvocationID.x;

    pParams_bitFieldBuffer_0._data[uint(_S1)] = OCBT_bit_mask_0[_S1];
    return;
}

In the GLSL output, every 0xffffffff entry has been emitted as 18446744073709551615UL (all 64 bits set) rather than 4294967295UL. The problem persists when compiling with -emit-spirv-directly and decompiling with SPIRV-Cross.

csyonghe commented 5 days ago

You can use the ULL suffix on the literals for now to work around this bug.
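
A minimal sketch of the workaround, using a hypothetical two-entry array rather than the full one from the report:

static const uint64_t masks[2] = {
    0xffffffffULL, // suffix keeps the value at 0x00000000FFFFFFFF
    0xffffULL,     // 16 bits set, as intended
};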

csyonghe commented 5 days ago

Note: the problem is that Slang is incorrectly treating all integer literals without a suffix as signed 32-bit ints. This is inconsistent with most other languages, where a hexadecimal literal that does not fit in a signed 32-bit int is treated as unsigned. Slang should do the same here.
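
In other words, each all-ones value comes from a two-step chain: the literal 0xffffffff is parsed as the signed 32-bit value -1, and converting -1 to uint64_t wraps it around to all 64 bits set (2^64 - 1 = 18446744073709551615). A hypothetical snippet illustrating the chain (names invented for illustration):

static const int32_t asSigned = 0xffffffff;     // parsed as signed 32-bit: value is -1
static const uint64_t extended = asSigned;      // sign-extended to 0xFFFFFFFFFFFFFFFF
static const uint64_t intended = 0xffffffffULL; // stays 0x00000000FFFFFFFF = 4294967295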

Dynamitos commented 5 days ago

That is unexpected; thanks for the info.