Closed j4m3z0r closed 6 years ago
I managed to get this working by changing the TexImage2D line to this:
```csharp
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba16Ext, width, height, 0,
              PixelFormat.Rgba, pixelType, data);
```
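For what it's worth, this matches the spec: `GL_RGBA16` is not among the sized internal formats core OpenGL ES 3.0 accepts for `glTexImage2D`, while `Rgba16Ext` (`GL_RGBA16_EXT`) comes from the `EXT_texture_norm16` extension. So a guard along these lines may be worth having (a sketch using OpenTK-style ES 3.0 bindings; the exact enum names may differ between binding versions):

```csharp
// Sketch: check for EXT_texture_norm16 before using Rgba16Ext.
// GL_RGBA16 is not a core sized internal format in OpenGL ES 3.0,
// so Rgba16Ext only works where this extension is exposed.
string extensions = GL.GetString(StringName.Extensions);
bool hasNorm16 = extensions != null && extensions.Contains("GL_EXT_texture_norm16");

if (hasNorm16)
{
    GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba16Ext,
                  width, height, 0, PixelFormat.Rgba, pixelType, data);
}
else
{
    // Core ES 3.0 alternatives would be RGBA16F (half float) or RGBA16UI
    // (unfiltered integer), but both change the data layout and upload path.
    throw new NotSupportedException("GL_EXT_texture_norm16 not available");
}
```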
So I'll go ahead and close this. It seems like the original formulation should have worked, but since there's a path forward I won't cause any more noise here.
Hi there, I'm trying to use 16-bit RGBA textures (16 bits per channel -- 64 bits per pixel, not 16 bits total), but I'm getting InvalidOperation when calling glTexImage2D. This is on the current version published on NuGet, on a machine that (I think) should support feature level 11.1 (Nvidia 960M / Intel HD 530 GPUs). I'm initializing the EGL context with the client version 3 attribute:
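Roughly like this (a sketch only; the `EGL_*` constants follow the standard EGL C API, and the `Egl` wrapper names here are assumptions about my C# port):

```csharp
// Sketch: requesting an ES 3 context via EGL. The EGL_CONTEXT_CLIENT_VERSION
// attribute is standard EGL; the Egl.* binding names are hypothetical.
int[] contextAttributes =
{
    Egl.EGL_CONTEXT_CLIENT_VERSION, 3,
    Egl.EGL_NONE
};
IntPtr context = Egl.CreateContext(display, config, IntPtr.Zero, contextAttributes);
```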
The code I'm using here is based on the included templates, ported to C#. I had expected this to work, since I thought GLES3 included 16-bit textures. For reference, this is the line that loads the texture:
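Something like the following (a hypothetical reconstruction, since only the later fix is shown above; presumably the only difference from the working version is the non-EXT internal format):

```csharp
// Hypothetical reconstruction of the failing call -- using the plain
// Rgba16 internal format, which core ES 3.0 rejects with InvalidOperation.
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba16, width, height, 0,
              PixelFormat.Rgba, pixelType, data);
```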
Is this expected behavior? Or have I missed something in initializing the GL context? Apologies for treating Issues like a support forum -- I wasn't sure where else to ask, and I'm not sure whether the bug is mine or not. Thanks for your help!