vaiorabbit / ruby-opengl

Yet another OpenGL wrapper for Ruby (and wrapper code generator).

glGetProgramBinary - Strange.. #31

Closed samuelintheforest closed 3 years ago

samuelintheforest commented 3 years ago

Hello,

I recently ran into an issue when I tried to save my shader program to a binary file.

Here is the code snippet:

def write_binary()

  binary_length = ' ' * 4
  binary_format = ' ' * 4
  glGetProgramiv(@id, GL_PROGRAM_BINARY_LENGTH_OES, binary_length)
  binary_length = binary_length.unpack('l')[0]

  binary = ' ' * binary_length
  glGetProgramBinaryOES(@id, binary_length.to_i, nil, binary_format, binary)

  file = File.new( 'shader.bin' , 'w')
  file.puts binary.unpack('l')[0]
  file.close  

end

@id -> my shader program

For some reason binary_length was always 0, and as a consequence binary was empty as well (which always resulted in the generation of a 1-byte file). Do you have any idea what could have gone wrong? Perhaps I used the memory allocation wrongly, or the declaration of binary_length wasn't done appropriately?
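(A side note on the buffer idiom, for reference: allocating a writable String and decoding it with unpack is the same pattern used further down in this thread, so the idiom itself looks fine. A minimal sketch, assuming @id is a valid, linked program handle:)

  # Allocate 4 writable bytes for one GLint, let GL fill them, then decode.
  status_buf = ' ' * 4
  glGetProgramiv(@id, GL_LINK_STATUS, status_buf)
  status = status_buf.unpack1('l')   # 'l' = 32-bit signed int, native byte order
  puts "link status: #{status}"      # 1 (GL_TRUE) if the program linked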

Apart from that, I'm glad that you've made this wrapper for both GL and GLFW. I've been using it for about 7 months now. It is awesome!!! I still have plenty of questions about memory management, String#unpack and Array#pack, and their proper usage, but all in all I'm satisfied.

A BIG THANK YOU for this! :D

Greetings, Samuel

vaiorabbit commented 3 years ago

Hello,

Have you checked that your environment is capable of running OpenGL ES? You can find the OpenGL ES APIs in lib/opengl_es_command.rb, but that alone doesn't mean you can use those APIs in your environment.
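(As a rough sketch of such a check, assuming a current GL context and that glGetString returns a Fiddle::Pointer as in this gem's bindings:)

  # Print what the driver actually exposes; .to_s reads the NUL-terminated string.
  version  = glGetString(GL_VERSION).to_s
  renderer = glGetString(GL_RENDERER).to_s
  glsl     = glGetString(GL_SHADING_LANGUAGE_VERSION).to_s
  puts "Version  : #{version}"
  puts "Renderer : #{renderer}"
  puts "GLSL     : #{glsl}"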

samuelintheforest commented 3 years ago

Yeah, I'm running on a Raspberry Pi (it supports OpenGL ES 3.1).

glGetProgramBinaryOES is supported; I tested it with glxinfo.

And the strange thing is that when I execute glGetProgramiv with GL_COMPILE_STATUS as the second parameter, I get the value back correctly. Maybe there is something wrong with GL_PROGRAM_BINARY_LENGTH_OES... (I tried plain GL_PROGRAM_BINARY_LENGTH as well, but it didn't work either...)

vaiorabbit commented 3 years ago

I ran the code below on Windows with an NVIDIA GeForce RTX 2060 because I don't have a Raspberry Pi.

[Excerpt from ruby report_env.rb]

Version    : 4.5.0 NVIDIA 456.71
Vendor     : NVIDIA Corporation
Renderer   : GeForce RTX 2060 SUPER/PCIe/SSE2
Shader     : 4.50 NVIDIA

First, could you check whether your environment really supports binary shaders?

def assert_no_error
  e = glGetError()
  if e != GL_NO_ERROR
    $stderr.puts "OpenGL error : #{gluErrorString(e)} (#{e})\n"
    exit
  end
end

# Make sure the driver supports binary format
assert_no_error()
binary_format = ' ' * 4
glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, binary_format)
pp binary_format.unpack1('l') # -> returned 1 on my environment
assert_no_error()

I can successfully save the binary shader of sample/GLES/gles.rb by inserting the code below:

  binary_length = ' ' * 4
  binary_format = ' ' * 4
  glGetProgramiv(prog_handle, GL_PROGRAM_BINARY_LENGTH, binary_length)
  assert_no_error()
  binary_length = binary_length.unpack1('l')
  pp binary_length # -> 7409

  binary = ' ' * binary_length
  glGetProgramBinary(prog_handle, binary_length.to_i, nil, binary_format, binary)
  assert_no_error()
  File.open('shader.bin', 'wb') do |file|
    file.puts binary
  end
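
(A side note, sketch only: binary_format is needed again when the blob is loaded back with glProgramBinary, so it may be worth writing it out next to the binary. The 4-byte-format-then-blob file layout below is just an assumed convention, not something taken from the samples:)

  # Store the format next to the blob so both can be restored later (assumed layout).
  File.open('shader.bin', 'wb') do |file|
    file.write binary_format   # the 4-byte GLenum filled in by glGetProgramBinary
    file.write binary
  end

  # ...and later, reload it into a fresh program object:
  data   = File.binread('shader.bin')
  format = data.unpack1('L')   # unsigned 32-bit GLenum
  blob   = data[4..-1]
  prog   = glCreateProgram()
  glProgramBinary(prog, format, blob, blob.bytesize)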

One strange thing: you are only writing the very first element of the binary array into shader.bin. Could you check whether this is what you want to do?

  file = File.new( 'shader.bin' , 'w')
  file.puts binary.unpack('l')[0] # <- file.puts binary  ?
  file.close  
samuelintheforest commented 3 years ago

Uhm... yeah... My computer doesn't support binary formats 😁 Thanks for the help. (GL_NUM_PROGRAM_BINARY_FORMATS returned 0...)