kylebarron / deck.gl-raster

deck.gl layers and WebGL modules for client-side satellite imagery analysis
https://kylebarron.dev/deck.gl-raster/

Dusting off and potential unification with viv #134

Open kylebarron opened 8 months ago

kylebarron commented 8 months ago

I've been informally chatting with @manzt recently about a generic deck.gl layer for visualizing and operating on image data that can be used both for geospatial and microscopy (and more!) use cases.

I've been working on a new project at @developmentseed called lonboard that connects deck.gl to Python for geospatial applications, and Development Seed does a ton of client work with raster data, so I may have some time to invest in a potential refreshed raster visualization layer.

The purpose of this issue is to see if there's potential for collaboration with viv folks on low-level multi-channel raster/image visualization that can be generic enough for many use cases.

There are three main parts to the existing deck.gl-raster codebase (reflected in the directory layout): data loading, deck.gl layers, and WebGL modules that generate shader pipelines. Each of these should be fully composable and generic across geo/microscopy use cases.

Data Loading

deck.gl-raster's primary existing data loader is the NPYLoader, which loads the NumPy (.npy) binary format.

Other possible loaders would include (Cloud-Optimized) (Geo)TIFF and Zarr (GeoZarr and OME-Zarr). For tiled sources, a generic deck.gl TileLayer may be used to render multiple RasterLayers.
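To make the tiled case concrete, here is a rough, untested sketch of a TileLayer whose sub-layers are RasterLayers. The tile endpoint and the `decodeBands` helper are hypothetical, and the RasterLayer prop names (`images`, `modules`, `moduleProps`) are taken from the current README, so treat this as a shape rather than working code:

```typescript
import {TileLayer} from '@deck.gl/geo-layers';
import {RasterLayer} from '@kylebarron/deck.gl-raster';

// Hypothetical helper that decodes raw tile bytes into per-band textures/arrays.
declare function decodeBands(buffer: ArrayBuffer): unknown;

const tileLayer = new TileLayer({
  id: 'raster-tiles',
  // Hypothetical tiled endpoint serving raw band data (e.g. NPY or COG chunks).
  data: 'https://example.com/tiles/{z}/{x}/{y}',
  getTileData: async ({url, signal}) => {
    const response = await fetch(url, {signal});
    return decodeBands(await response.arrayBuffer());
  },
  renderSubLayers: (props) => {
    const {west, south, east, north} = props.tile.bbox;
    return new RasterLayer(props, {
      // Prop names (`images`, `modules`, `moduleProps`) follow the current README.
      images: {imageBands: props.data},
      modules: [],
      moduleProps: {},
      // BitmapLayer-style bounds: [left, bottom, right, top].
      bounds: [west, south, east, north]
    });
  }
});
```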

Deck.gl Layer

Currently, this project has two layers, the RasterLayer and the RasterMeshLayer. The former extends the BitmapLayer and the latter is a combination of the RasterLayer and the MeshLayer to be used for geospatial terrain rendering. For the discussion at hand, we can ignore the RasterMeshLayer and focus on the RasterLayer.

The design of the RasterLayer was to add three props to the BitmapLayer.

The fragment shader of the RasterLayer adds two shader injection points, DECKGL_CREATE_COLOR and DECKGL_MUTATE_COLOR. These are injected here. These allow an image to be instantiated and then one or more modules to mutate color values (it's possible these should be joined into a single injection point).

Then the list of shader modules is assembled into a string that is injected here.
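For illustration, here is a minimal sketch of what a module hooking into those injection points might look like, following the luma.gl shader-module object shape. The module name, the `image` variable mutated inside `DECKGL_MUTATE_COLOR`, and the `getUniforms` wiring are assumptions based on the description above, not copied from the existing modules:

```typescript
// A hypothetical gamma-correction module that mutates the decoded image color.
const gammaContrast = {
  name: 'gamma-contrast',
  // GLSL injected into the fragment shader alongside the other modules.
  fs: `\
uniform float gamma;
vec4 apply_gamma(vec4 image) {
  return vec4(pow(image.rgb, vec3(1.0 / gamma)), image.a);
}
`,
  inject: {
    // Assumes the mutation hook exposes the current color as `image`.
    'fs:DECKGL_MUTATE_COLOR': `
    image = apply_gamma(image);
    `
  },
  // Wires a value from the layer's moduleProps into the shader uniform.
  getUniforms: (opts: {gamma?: number} = {}) => ({gamma: opts.gamma ?? 1.0})
};
```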

Shader modules

A variety of geospatial algorithms have already been implemented as GPU modules (see the webgl/ folder). These are relatively straightforward: they take a four-channel image as input and return some sort of array as output.

One very important thing about shader modules is that the output dimension of one stage needs to match the input dimension of the next stage. If you upload an RGB image and then apply a colormap, it will silently ignore the G and B channels of the original image. If you apply something like Normalized Difference (e.g. for NDVI) and don't follow it with a colormap, I think you'll see only shades of red, because the GPU interprets the single output channel as the red channel. Having better validation for this would be very helpful.
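As an illustration of the dimension-matching constraint, here is a hedged sketch contrasting a valid NDVI pipeline with a broken one. The module names (`combineBands`, `normalizedDifference`, `colormap`) come from the webgl/ folder, but the exact exports, prop names, and texture plumbing are assumptions:

```typescript
import {
  RasterLayer,
  combineBands,
  normalizedDifference,
  colormap
} from '@kylebarron/deck.gl-raster';

// Textures and bounds are assumed to have been prepared by a loader elsewhere.
declare const imageBands: unknown[];
declare const imageColormap: unknown;
declare const bounds: [number, number, number, number];

// Valid pipeline: combine bands -> single-channel index -> colormapped RGBA.
// Each stage's output dimension matches the next stage's expected input.
const ndviLayer = new RasterLayer({
  id: 'ndvi',
  images: {imageBands, imageColormap},
  modules: [combineBands, normalizedDifference, colormap],
  moduleProps: {},
  bounds
});

// Problematic pipeline: normalizedDifference emits one channel, so without a
// trailing colormap the GPU renders that channel as red only. A validation
// step could compare each module's declared output dimension with the next
// module's expected input dimension and warn at layer construction time.
const brokenModules = [combineBands, normalizedDifference];
```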

Pieces of work

Update to latest deck.gl

I haven't touched this code for a few years and I have no idea if it works with the latest deck.gl.

Conversion to TypeScript

deck.gl 8.9 supports TypeScript, and a conversion would probably make the code a lot more maintainable.
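As a sketch of what a conversion might buy us, typed props for the RasterLayer could look roughly like this; the prop names mirror the existing JS props and the `ShaderModule` shape is a guess, not the final API:

```typescript
import type {BitmapLayerProps} from '@deck.gl/layers';

// Rough shape of a shader module as described above (assumed, not finalized).
type ShaderModule = {
  name: string;
  fs?: string;
  inject?: Record<string, string>;
  getUniforms?: (opts?: Record<string, unknown>) => Record<string, unknown>;
};

// RasterLayer props layered on top of BitmapLayer's props.
type RasterLayerProps = BitmapLayerProps & {
  modules?: ShaderModule[];
  images?: Record<string, unknown>;
  moduleProps?: Record<string, unknown>;
};
```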

More stable raster pipelines

As I alluded to in the shader modules section above, it's easy to shoot yourself in the foot when creating a raster pipeline.

Ensure compatibility with non-geo views

Presumably viv uses non-geospatial views. I haven't tested the existing RasterLayer in a non-geospatial use case, but I don't remember anything off-hand that would break.
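For reference, a non-geospatial setup would presumably look something like the following untested sketch, pairing an OrthographicView with pixel-space bounds; the `images`/`modules` props again follow the current README:

```typescript
import {Deck, OrthographicView} from '@deck.gl/core';
import {RasterLayer} from '@kylebarron/deck.gl-raster';

// Image dimensions and band textures assumed to come from a (non-geo) loader.
declare const width: number;
declare const height: number;
declare const imageBands: unknown[];

const deck = new Deck({
  // Replace the default geospatial MapView with a pixel-coordinate view,
  // roughly how viv positions image data.
  views: new OrthographicView({id: 'ortho'}),
  initialViewState: {target: [width / 2, height / 2, 0], zoom: 0},
  controller: true,
  layers: [
    new RasterLayer({
      id: 'image',
      images: {imageBands},
      modules: [],
      moduleProps: {},
      // Pixel-space bounds: [left, bottom, right, top], y increasing downward.
      bounds: [0, height, width, 0]
    })
  ]
});
```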

Open Questions

- Can we ignore WebGL 1 entirely at this point?
- How much will deck v9's move to WebGPU-compatible shaders change the content of these layers?

cc @manzt and I'll let you cc anyone else who should be included.

ibgreen commented 8 months ago

> Can we ignore WebGL 1 entirely at this point?

My vote is yes, it is time to ignore WebGL 1.

deck.gl v9 will not support WebGL 1, at least not initially. The effort to support it would not be inconsiderable, and I suspect the community would rather see us exploit WebGPU's potential than spend our time on WebGL 1. If some particular company needs WebGL 1, maybe they can provide the resources or funding for it.

> How much will deck v9's move to WebGPU-compatible shaders change the content of these layers?

We are talking about a completely new shader language, so there will be impact. But as support is currently being built out in luma.gl, I think luma could take requirements from this project into account. If we are able to write down a set of problems / requirements....

ibgreen commented 8 months ago

I am eager to join this effort. If a number of people are coming together around this, we could start a "raster working group" in Open Visualization: a roadmap doc, a Slack channel, and potentially a few meetings.

manzt commented 8 months ago

Thanks for opening the issue @kylebarron and kicking off the discussion!

> I am eager to join this effort.

Great!

> Other possible loaders would include (Cloud-Optimized) (Geo)TIFF and Zarr (GeoZarr and OME-Zarr). For tiled sources, a generic deck.gl TileLayer may be used to render multiple RasterLayers.

Much of this scope currently lives under @vivjs/loaders, specifically for Zarr and (OME-)TIFF. In general, that work has been towards unifying multiscale n-dimensional data sources. I'd be happy to take a lead there. It would be interesting to see whether these loaders could be used as-is with deck.gl-raster today.
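To make that concrete, here is a rough sketch (assuming an ESM context with top-level await) of pulling one channel out of an OME-TIFF with viv's loader; how the resulting typed array gets uploaded as a texture and wired into a RasterLayer is exactly the open question:

```typescript
import {loadOmeTiff} from '@hms-dbmi/viv';

// Fetch the image pyramid; `data` is an array of pixel sources,
// highest resolution first.
const {data: pyramid} = await loadOmeTiff('https://example.com/image.ome.tif');

// Read one channel of the lowest-resolution level.
const lowestResolution = pyramid[pyramid.length - 1];
const {data, width, height} = await lowestResolution.getRaster({
  selection: {c: 0, t: 0, z: 0}
});

// `data` is a typed array for a single channel; it could be uploaded as a
// texture and handed to a RasterLayer through its `images` prop.
console.log(width, height, data.length);
```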

> Update to latest deck.gl

Also something we are working towards in Viv; specifically, we would love for the layers code to migrate to TS as well. Perhaps a starting point could be to unify around a "core" RasterLayer with the kind of composable pipeline you mentioned. We could see about replacing our XRLayer with such a primitive in Viv.