kylebarron / deck.gl-raster

deck.gl layers and WebGL modules for client-side satellite imagery analysis
https://kylebarron.dev/deck.gl-raster/
MIT License

Support 16bit bands #16

Closed · kylebarron closed this issue 4 years ago

kylebarron commented 4 years ago

There are a couple of possibilities for supporting 16-bit raster bands.

  1. Raw uint16 bytes as a binary blob

  2. 16-bit grayscale PNG

    Canvas methods don't support 16 bits, so you'd need to use a custom PNG parser like UPNG.js. Additionally, the PNG spec defines multi-byte values to be stored in big-endian order, while WebGL expects input data in system endianness, which is usually little-endian. So you'd have to copy the data into a new TypedArray, converting it to little-endian.

    Pseudocode for switching to little-endian:

    // `buffer`: an ArrayBuffer of the decoded, big-endian 16-bit image data
    const view = new DataView(buffer);
    const littleEndian = new Uint16Array(buffer.byteLength / 2);
    for (let i = 0; i < littleEndian.length; i++) {
      // getUint16 reads big-endian by default, matching the PNG spec;
      // the Uint16Array stores the value in system (little-endian) order
      littleEndian[i] = view.getUint16(i * 2);
    }
  3. Two 8-bit bands of an RGB PNG

    You'd have to recombine the two bands into a single 16-bit value on the GPU (see StackOverflow); a sketch follows this list.
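
A minimal sketch of option 3's GPU recombination (hypothetical, not from this thread), assuming the high byte was packed into the red channel and the low byte into the green channel of an ordinary 8-bit RGB PNG; the uniform and varying names are placeholders:

    // GLSL ES 1.00 fragment shader, so this also works in WebGL 1
    const fs = `
    precision highp float;

    uniform sampler2D bitmapTexture; // ordinary 8-bit RGB texture (assumed name)
    varying vec2 vTexCoord;

    void main() {
      vec4 rgba = texture2D(bitmapTexture, vTexCoord);
      // texture2D returns channels normalized to [0, 1]; undo that, then
      // recombine high and low bytes into a 16-bit value normalized to [0, 1]
      float value = (rgba.r * 255.0 * 256.0 + rgba.g * 255.0) / 65535.0;
      gl_FragColor = vec4(vec3(value), 1.0);
    }
    `;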

Additionally, I couldn't get an array of Uint16 values to work in WebGL. I think you need to switch some texture parameters. See 1 2. Some options might not work in WebGL 1.

kylebarron commented 4 years ago

Of course it doesn't work when you supply GL.HALF_FLOAT, because uint16 isn't a half float...

Instead, you want to set the data type to GL.UNSIGNED_SHORT.

See WebGL docs here.

You probably want:

internalFormat: GL.R16UI
format: GL.RED_INTEGER
type: GL.UNSIGNED_SHORT

But that's probably WebGL2 only
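
For reference, a minimal raw-WebGL2 sketch of that combination (hypothetical, not from this thread; assumes `gl` is a WebGL2 context and `data` is a Uint16Array of length width * height):

    const texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    // Row stride is width * 2 bytes, which may not match the default 4-byte unpack alignment
    gl.pixelStorei(gl.UNPACK_ALIGNMENT, 2);
    // Unsigned-integer textures aren't filterable, so use NEAREST
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texImage2D(
      gl.TEXTURE_2D,
      0,                  // mip level
      gl.R16UI,           // internal format
      width,
      height,
      0,                  // border (must be 0)
      gl.RED_INTEGER,     // format
      gl.UNSIGNED_SHORT,  // type
      data
    );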

kylebarron commented 4 years ago

https://stackoverflow.com/questions/57699322/webgl2-render-uint16array-to-canvas-as-an-image?noredirect=1&lq=1

kylebarron commented 4 years ago

So I made a bit of progress:

  1. You can't have GL.LINEAR filters with uints; the texture parameters presumably need GL.NEAREST for both min and mag filters:

    const DEFAULT_TEXTURE_PARAMETERS = {
      // Presumed contents: integer textures aren't filterable, so use NEAREST
      [GL.TEXTURE_MIN_FILTER]: GL.NEAREST,
      [GL.TEXTURE_MAG_FILTER]: GL.NEAREST
    };
  2. Make sure you're loading uint16 satellite images... If you load uint8 then you won't have enough bytes

  3. "Working" texture params. You need to specify height, width. Turned off mipmaps for now...

    const data = new Uint16Array(image);
    const texture = new Texture2D(gl, {
      width: 256,
      height: 256,
      data,
      parameters: DEFAULT_TEXTURE_PARAMETERS,
      // Colormaps are 10 pixels high
      // Load colormaps as RGB; all others as LUMINANCE
      // format: image && image.height === 10 ? GL.RGB : GL.LUMINANCE,
      // format: GL.RED_INTEGER,
      format: GL.R16UI,
      dataFormat: GL.RED_INTEGER,
      type: GL.UNSIGNED_SHORT,
      mipmaps: false
    });

But now my issue is that I need to switch to WebGL 2 syntax...
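
A minimal sketch of what the WebGL2 shader side might look like (hypothetical, not from this thread): an unsigned-integer texture has to be declared as a usampler2D in GLSL ES 3.00, and texture() then returns raw integer values that you normalize yourself. The uniform and varying names are assumptions:

    const fs = `#version 300 es
    precision highp float;

    uniform highp usampler2D bitmapTexture; // the R16UI texture (assumed name)
    in vec2 vTexCoord;
    out vec4 fragColor;

    void main() {
      // texture() on a usampler2D returns raw uints, not [0, 1] floats
      uint raw = texture(bitmapTexture, vTexCoord).r;
      float value = float(raw) / 65535.0;
      fragColor = vec4(vec3(value), 1.0);
    }
    `;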

kylebarron commented 4 years ago

Check the viv shaders for a possibly helpful reference: https://github.com/hms-dbmi/viv/blob/master/src/layers/XRLayer/xr-layer-fragment.webgl2.glsl

Viv does look like it supports 16-bit data in shaders: https://github.com/hms-dbmi/viv/blob/603e5e0967eec1b360623dbe51357baa6bdf71fc/src/constants.js#L20-L26