go-gl / gl

Go bindings for OpenGL (generated via glow)
MIT License

A dangerous function #128

Open aabbtree77 opened 3 years ago

aabbtree77 commented 3 years ago

I stumbled upon one obscurity with gl.BufferData and decided to post this here in case someone else (or future me) hits it.

Suppose we have a simple mesh to be rendered such as the quad whose triangle vertices are

inds := []uint{0, 1, 2, 1, 3, 2}

Then one uploads this data to the GPU buffer:

var buffID uint32
gl.GenBuffers(1, &buffID)
gl.BindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffID)
// The size argument is the slice length times the element size in bytes.
gl.BufferData(gl.ELEMENT_ARRAY_BUFFER, len(inds)*int(reflect.TypeOf(inds).Elem().Size()), gl.Ptr(inds), gl.STATIC_DRAW)

Here len and Elem are used to compute the size of inds in bytes. They are needed because reflect.TypeOf(inds).Size() alone does not work on the slice: it returns the size of the slice header, 24 bytes, whereas the correct value is 48 bytes, since there are 6 indices and uint is 8 bytes on my machine.
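For anyone who wants to see the difference, here is a minimal standalone check (assuming a 64-bit platform, so uint and the slice header fields are 8 bytes each):

package main

import (
	"fmt"
	"reflect"
)

func main() {
	inds := []uint{0, 1, 2, 1, 3, 2}

	// Size of the slice header itself (pointer + len + cap): 24 bytes on 64-bit.
	fmt.Println(reflect.TypeOf(inds).Size())

	// Element size times element count: 8 * 6 = 48 bytes of actual index data.
	fmt.Println(len(inds) * int(reflect.TypeOf(inds).Elem().Size()))
}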

Renderdoc reveals that the GPU ends up reading the index buffer as

{0 0 1 0 2 0}

which silently destroys the mesh and the subsequent rendering, without any compiler warning or OpenGL error.
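The {0 0 1 0 2 0} pattern is just the first 24 bytes of the 8-byte indices reinterpreted as 4-byte ones. A small sketch that reproduces the effect on the CPU (little-endian assumed, no OpenGL involved):

package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

func main() {
	// The index data as it sits in memory when each index is 8 bytes.
	inds := []uint64{0, 1, 2, 1, 3, 2}

	var buf bytes.Buffer
	binary.Write(&buf, binary.LittleEndian, inds) // 48 bytes in the buffer

	// glDrawElements with gl.UNSIGNED_INT walks the same bytes 4 at a time,
	// so the first six indices it sees come from the first 24 bytes.
	seen := make([]uint32, 6)
	binary.Read(bytes.NewReader(buf.Bytes()[:24]), binary.LittleEndian, seen)
	fmt.Println(seen) // [0 0 1 0 2 0]
}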

The solution is to change the type from uint, which is 8 bytes on my 64-bit Ubuntu 20.04, to uint32, which is 4 bytes:

inds := []uint32{0, 1, 2, 1, 3, 2}

Everything works fine this way.
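For completeness, the working path with the index width and the draw call kept in sync looks roughly like this (a sketch, assuming the rest of the setup — context, VAO, vertex buffer, shaders — is already in place):

inds := []uint32{0, 1, 2, 1, 3, 2}

var buffID uint32
gl.GenBuffers(1, &buffID)
gl.BindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffID)

// 4 bytes per uint32 index, no reflection needed.
gl.BufferData(gl.ELEMENT_ARRAY_BUFFER, len(inds)*4, gl.Ptr(inds), gl.STATIC_DRAW)

// The gl.UNSIGNED_INT here is exactly why the elements must be 4 bytes wide.
gl.DrawElements(gl.TRIANGLES, int32(len(inds)), gl.UNSIGNED_INT, nil)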

Interestingly, the analogous expressions in C++ did not produce any rendering problems on my machine, because unsigned int as compiled by gcc on Ubuntu 20.04 is 4 bytes. Consider this a warning not to use platform-dependent integer types such as int and uint with the OpenGL buffer functions that go through gl.Ptr or unsafe.Pointer. It is also one more way to get erroneous behavior and blank screens without any error message.
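If one still wants the byte size computed from the slice type rather than hard-coded, a generic helper (hypothetical, Go 1.18+) avoids the reflect dance, though it of course cannot catch the width mismatch itself:

package main

import (
	"fmt"
	"unsafe"
)

// sliceBytes returns the size in bytes of the slice's backing data.
func sliceBytes[T any](s []T) int {
	var zero T
	return len(s) * int(unsafe.Sizeof(zero))
}

func main() {
	fmt.Println(sliceBytes([]uint32{0, 1, 2, 1, 3, 2})) // 24
	fmt.Println(sliceBytes([]uint{0, 1, 2, 1, 3, 2}))   // 48 on 64-bit: the size is right,
	// but the 8-byte elements still will not match gl.UNSIGNED_INT.
}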

aabbtree77 commented 3 years ago

I have just checked that some C++ code (along the lines of learnopengl.com) also breaks with index sizes other than uint32_t. So, for instance, this works:

glBufferData(GL_ELEMENT_ARRAY_BUFFER, inds.size()*sizeof(std::uint32_t), &inds[0], GL_STATIC_DRAW);

whenever inds is a std::vector<std::uint32_t>, but this does not:

glBufferData(GL_ELEMENT_ARRAY_BUFFER, inds.size()*sizeof(std::uint64_t), &inds[0], GL_STATIC_DRAW);

when inds is a std::vector<std::uint64_t>. Again, just as with uint vs uint32 in Go above, there are no complaints from the compiler or from the OpenGL error-checking machinery, yet the first case yields the correct result while the second loads a broken mesh and ends in a black screen.

So ideally an API should make this constraint explicit: the width of the index elements has to match the type passed to glDrawElements, 32-bit in these examples. Otherwise, at least on my machine (Ubuntu 20.04 + GTX 760), I get undefined behavior, reproduced with both the C++ and the Go code bases; one way to enforce this on the Go side is sketched below.
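One way to get that guarantee in Go is to funnel every index upload through a wrapper whose signature only accepts []uint32, so the element width can never drift away from gl.UNSIGNED_INT. A hypothetical sketch (the function name is made up):

// uploadIndexBuffer creates and fills an ELEMENT_ARRAY_BUFFER from 32-bit indices only.
func uploadIndexBuffer(inds []uint32) uint32 {
	var ebo uint32
	gl.GenBuffers(1, &ebo)
	gl.BindBuffer(gl.ELEMENT_ARRAY_BUFFER, ebo)
	gl.BufferData(gl.ELEMENT_ARRAY_BUFFER, len(inds)*4, gl.Ptr(inds), gl.STATIC_DRAW)
	return ebo
}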

So those learnopengl.com samples are not safe in this sense either; it is likely not a good idea to use platform-dependent integer sizes with a low-level graphics API like this.