expipiplus1 / vulkan

Haskell bindings for Vulkan
https://hackage.haskell.org/package/vulkan
BSD 3-Clause "New" or "Revised" License

Casting to INT gives wrong result #523

Closed: largeword closed this issue 3 months ago

largeword commented 3 months ago

I wrote the following GLSL code, which tries to fill a buffer with the global invocation ID:

#version 450
#extension GL_ARB_separate_shader_objects : enable

layout(local_size_x = 256, local_size_y = 1, local_size_z = 1) in;

layout(std430, binding = 1) buffer lay1 { int outbuf[]; };

void main() {
  const uint id = gl_GlobalInvocationID.x; // current offset
  outbuf[id] = int(gl_GlobalInvocationID.x);
}

The output buffer has 10 elements. I'm using API_VERSION_1_3, GHC 9.2.5, the vulkan binding 3.26.1 from Hackage, and Vulkan API 1.3.277. My GPU is an RTX 3090 with driver 550.67.

When I execute the GLSL code, it gives me [4294967296,12884901890,21474836484,30064771078,38654705672,47244640266,55834574860,64424509454,73014444048,81604378642], which is clearly wrong, and I don't think the conversion should exceed the range of int. This error does not happen if I write C++ code and compile the GLSL manually, but if I load the compiled .spv binary from Haskell, the error happens again.
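
The reported values happen to match two consecutive 32-bit results packed into one 64-bit word. A minimal sketch of that arithmetic (an assumption based only on the numbers above: little-endian host, 64-bit Int on the Haskell side):

-- If the shader writes the 32-bit values 0,1,2,... and the host reads the same
-- bytes back as 64-bit Ints, element i packs the two consecutive values 2i and
-- 2i+1, i.e. value_i = 2*i + (2*i + 1) * 2^32.
reconstructed :: [Integer]
reconstructed = [2 * i + (2 * i + 1) * 2 ^ 32 | i <- [0 .. 9]]
-- [4294967296,12884901890,21474836484,30064771078,38654705672,
--  47244640266,55834574860,64424509454,73014444048,81604378642]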

If I change the output type to float and use float(gl_GlobalInvocationID.x) to cast the data, the results are correct. I also tried int(float(gl_GlobalInvocationID.x)), but it gives the same wrong result.

I attached the full code; can someone kindly point out a solution or the mistake I made? code.txt

largeword commented 3 months ago

Problem solved: if I change all Int to Int32 in Haskell, it gives me the correct result. But I thought the default Int would just be Int32; how could this happen? Is there a better way to solve it?
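
A minimal sketch of that fix on the host side, assuming the results are read back through a mapped pointer (the names here are placeholders, not the ones in the attached code.txt): read the buffer as Int32, whose Storable instance is 4 bytes wide, matching the GLSL int.

import Data.Int (Int32)
import Foreign.Marshal.Array (peekArray)
import Foreign.Ptr (Ptr, castPtr)

-- 'mappedPtr' is a hypothetical placeholder for whatever pointer the real
-- code obtains from mapping the buffer's memory.
readResults :: Ptr () -> IO [Int32]
readResults mappedPtr =
  -- Read 10 elements of 4 bytes each, matching the shader's GLSL 'int'.
  peekArray 10 (castPtr mappedPtr :: Ptr Int32)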

dpwiz commented 3 months ago

Yeah, host Int may be different from the GPU's int (which is indeed int32). Don't use unsized types when talking to a GPU: sizeOf (0 :: Int) would get it wrong, and so would getElemOff.
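
A small self-contained sketch of that mismatch (not taken from the attached code): write twenty 32-bit values into memory and read the same bytes back as 64-bit Ints, which reproduces the garbage values reported above on a little-endian 64-bit machine.

import Data.Int (Int32)
import Foreign.Marshal.Array (peekArray, withArray)
import Foreign.Ptr (castPtr)
import Foreign.Storable (sizeOf)

main :: IO ()
main = do
  -- On 64-bit GHC, Int is 8 bytes while the GLSL 'int' is 4 bytes, so any size
  -- or offset computed with sizeOf (0 :: Int) is twice what the shader expects.
  print (sizeOf (0 :: Int), sizeOf (0 :: Int32))    -- (8,4) on a 64-bit machine
  withArray ([0 .. 19] :: [Int32]) $ \p -> do
    asInt <- peekArray 10 (castPtr p) :: IO [Int]   -- misread as 64-bit Ints
    print asInt                                     -- [4294967296,12884901890,...]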