Allow texture internalFormat/type as props
WWTJSD? WWBJSD? WWFD? :D
From my understanding, when using `gl.compressedTexImage2D` the `internalFormat` replaces `format` and `type`, as the data is always an opaque binary blob. This way you could say `pixelFormat` is the `internalFormat` and just have:
```js
const TextureFormat = {
  RGBA8UI: [gl.RGBA_INTEGER, DataType.Uint8],
  // ...
  COMPRESSED_RGB_S3TC_DXT1: [ext1.COMPRESSED_RGB_S3TC_DXT1_EXT],
  // ...
  COMPRESSED_RGBA_BPTC_UNORM: [ext2.COMPRESSED_RGBA_BPTC_UNORM_EXT],
  // ...
};
```
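To make the idea concrete, here is a minimal sketch (a hypothetical helper, not pex-context's actual code) of how such a table could be consumed: two-element entries go through `texImage2D`, while single-constant compressed entries go through `compressedTexImage2D`, which takes no `format`/`type` arguments at all.

```js
// Hypothetical helper (not pex-context's implementation), assuming a resolved
// internalFormat (equal to format in WebGL1, or a sized constant like
// gl.RGBA8UI in WebGL2) and a TextureFormat entry shaped as proposed above.
function upload(gl, internalFormat, entry, width, height, data) {
  if (entry.length === 2) {
    // Uncompressed path: the entry carries a [format, type] pair.
    const [format, type] = entry;
    gl.texImage2D(gl.TEXTURE_2D, 0, internalFormat, width, height, 0, format, type, data);
  } else {
    // Compressed path: no format/type exist, the data is an opaque,
    // pre-encoded blob sized for this internal format.
    gl.compressedTexImage2D(gl.TEXTURE_2D, 0, entry[0], width, height, 0, data);
  }
}
```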
But maybe it's not worth the effort, because you would need to somehow map the magic int in KTX2 or glTF matching `COMPRESSED_RGBA_BPTC_UNORM_EXT` to `COMPRESSED_RGBA_BPTC_UNORM` (without `_EXT`) in order for it to work.
Now that I think more about it: those constants still have to be defined somewhere... texture loader? What about the fact that some of them are `_EXT` in WebGL1 but in core in WebGL2?
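One thing worth noting on the `_EXT` point (a hedged sketch below, not the actual loader): the numeric values are identical, e.g. `COMPRESSED_RGBA_BPTC_UNORM_EXT` and core GL's `COMPRESSED_RGBA_BPTC_UNORM` are both `0x8E8C`, so a file's magic int can be matched against whatever extension object provides it, and only the *name* needs the suffix stripped.

```js
// Sketch: build a numericValue -> suffix-free name map from known extensions.
// The extension names are real WebGL extensions; the helper itself is
// illustrative, not pex-context API.
function buildCompressedFormatMap(gl) {
  const map = {};
  const extensions = [
    "WEBGL_compressed_texture_s3tc",
    "WEBGL_compressed_texture_etc",
    "WEBGL_compressed_texture_astc",
    "EXT_texture_compression_bptc",
  ];
  for (const extensionName of extensions) {
    const ext = gl.getExtension(extensionName);
    if (!ext) continue;
    for (const key in ext) {
      if (key.startsWith("COMPRESSED_")) {
        // Same numeric value whether the constant comes from a WebGL1
        // extension or WebGL2 core; only the _EXT/_KHR/_WEBGL suffix differs.
        map[ext[key]] = key.replace(/_(EXT|KHR|WEBGL)$/, "");
      }
    }
  }
  return map;
}
```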
> Is it okay to fall back to accepted data types in `ctx.TextureFormat`, eg. in case `gl.HALF_FLOAT` is undefined?
That very much depends on the use case. For render targets, where you want to save on big RGBA32F memory usage and opt in to RGBA16F, it would be fine. But single channel textures? I'm still confused though: how does one use e.g. R16F? MDN says they are not color renderable, and last time I checked there was no way to upload half-float data in WebGL1.
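For what it's worth, a sketch of the two moving parts (the extension names are real; the helpers are hypothetical): the half-float *type* constant differs between the two APIs, and color-renderability of formats like R16F can only be reliably established by testing framebuffer completeness (in WebGL2 it additionally depends on EXT_color_buffer_float / EXT_color_buffer_half_float being available).

```js
// Hypothetical helpers, not pex-context API.
function getHalfFloatType(gl, isWebGL2) {
  if (isWebGL2) return gl.HALF_FLOAT; // core constant (0x140B)
  const ext = gl.getExtension("OES_texture_half_float");
  // Note: HALF_FLOAT_OES (0x8D61) is a different number than gl.HALF_FLOAT.
  return ext ? ext.HALF_FLOAT_OES : undefined;
}

// Renderability can't be read off a table: attach the texture and ask the GL.
function isColorRenderable(gl, texture) {
  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);
  const ok = gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.deleteFramebuffer(fbo);
  return ok;
}
```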
> Any code using `ctx.PixelFormat.R32F` should be updated to: `ctx.texture2D({ pixelFormat: ctx.PixelFormat.Alpha, type: ctx.DataType.Float32 })`

Because it's not compatible with the WebGL2 usage of `gl.RED`? Shouldn't that be handled internally?
```js
R32F: [isWebGL2 ? gl.RED : gl.ALPHA, DataType.Float32],
```
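Even with that mapping handled internally, one hedged caveat: the shader side differs too. Sampling an `ALPHA` texture returns the value in `.a` (with rgb zeroed), while `RED` returns it in `.r`, so user shaders would still need a per-API swizzle along these lines (illustrative snippet, not pex-context code):

```js
// Which channel holds the data depends on which format backs R32F.
const singleChannelRead = isWebGL2
  ? "texture(tex, uv).r" // WebGL2 backs R32F with gl.RED
  : "texture2D(tex, uv).a"; // WebGL1 fallback uses gl.ALPHA
```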
Also it seems there is a typo: shouldn't `ctx.PixelFormat.Alpha` be `ctx.gl.Alpha`?
`Depth`, `Depth16`, `Depth24` are just so our code doesn't break, right? And, as you suggest, we use WebGL2-style uppercase for everything?
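For reference, a sketch of the underlying difference these names paper over (standard WebGL calls, not pex-context code): WebGL1's WEBGL_depth_texture only accepts the unsized `DEPTH_COMPONENT`, while WebGL2 expects sized internal formats.

```js
const width = 1024;
const height = 1024;
if (!isWebGL2) {
  // WebGL1: requires the WEBGL_depth_texture extension and the unsized
  // DEPTH_COMPONENT internal format.
  gl.getExtension("WEBGL_depth_texture");
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT, width, height, 0,
    gl.DEPTH_COMPONENT, gl.UNSIGNED_SHORT, null);
} else {
  // WebGL2: sized internal formats; DEPTH_COMPONENT24 would pair with
  // gl.UNSIGNED_INT instead of gl.UNSIGNED_SHORT.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT16, width, height, 0,
    gl.DEPTH_COMPONENT, gl.UNSIGNED_SHORT, null);
}
```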
> Do we need support for compressed textures for cubemaps?
Definitely not now. I haven't used cubemap textures for like 5 years outside of render targets. It could be useful in WebGL2 once seamless cubemaps are supported and we implement some of the glTF extensions for IBL, which I think use cubemaps.
What are those? WebGL1 vs WebGL2 mappings?
https://github.com/pex-gl/pex-context/blob/9f2867dc3c1c316e62a3ed3f83a12da58aef63de/index.js#L233
Will those more exotic formats be null in WebGL1?
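Presumably yes: constants like `gl.RG` or `gl.RED_INTEGER` simply don't exist on a WebGL1 context, so an eagerly built table would hold `undefined` there. A hypothetical guard (not pex-context code) could fail loudly instead of letting `undefined` reach `texImage2D`:

```js
// Hypothetical validation for table entries built from WebGL2-only constants.
function assertSupported(name, entry) {
  if (entry.some((value) => value === undefined)) {
    throw new Error(`TextureFormat.${name} is not available on this context`);
  }
}
```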
Should I make these comments on the commits instead of here, so they can be answered inline?
- `ctx.TextureFormat` as a mapping of `[internalFormat]: [format, type]` to automatically infer it in `ctx.texture` (WebGL2 ready)
- `ctx.DataType` to be used in `ctx.TextureFormat` with signed types (WebGL2 ready)
- Allow texture `internalFormat`/`type` as props (for compressed textures to pass their extension-defined `internalFormat`s)
- `ctx.capabilities.depthTexture` to `DEPTH_COMPONENT` instead of `DEPTH_COMPONENT16` (WEBGL_depth_texture only defines `DEPTH_COMPONENT` and `STENCIL_COMPONENT`, not `DEPTH_COMPONENT16` etc)
- `DEPTH_COMPONENT` and `STENCIL_COMPONENT` mapping to `ctx.TextureFormat`
- `premultiplyAlpha` typo, `width` instead of `height`, and `opts.wrap` order issues

TBD:

- Allow texture `internalFormat`/`type` as props (for compressed textures to pass their extension-defined `internalFormat`s): does it need to be an enum of supported compressed formats instead, or is passing `internalFormat` acceptable?
- Is it okay to fall back to accepted data types in `ctx.TextureFormat`, eg. in case `gl.HALF_FLOAT` is undefined, fall back to `gl.FLOAT` (`R16F: [gl.RED, DataType.Float16], // DataType.Float32` could be `R16F: [gl.RED, DataType.Float16 || DataType.Float32]`)? See the sketch after this list.
- Any code using `ctx.PixelFormat.R32F` should be updated to `ctx.texture2D({ pixelFormat: ctx.PixelFormat.Alpha, type: ctx.DataType.Float32 })`?
- Do we need support for compressed textures for cubemaps?
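Regarding the `Float16 || Float32` fallback above, a hedged note: the `||` only works if `DataType.Float16` actually resolves to `undefined` on WebGL1, and `gl.HALF_FLOAT` (0x140B, WebGL2) is a different number than the WebGL1 extension's `HALF_FLOAT_OES` (0x8D61), so a plain `||` on `gl.HALF_FLOAT` alone would skip the WebGL1 extension path entirely. A more explicit sketch (illustrative, assuming `DataType` maps to raw GL enums):

```js
// Pick half float when the context or extension provides it,
// otherwise fall back to full float.
const halfFloatExt = gl.getExtension("OES_texture_half_float");
const Float16 = gl.HALF_FLOAT ?? halfFloatExt?.HALF_FLOAT_OES;
const R16F = [gl.RED ?? gl.ALPHA, Float16 ?? gl.FLOAT];
```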