gkjohnson / three-gpu-pathtracer

Path tracing renderer and utilities for three.js built on top of three-mesh-bvh.
https://gkjohnson.github.io/three-gpu-pathtracer/example/bundle/index.html
MIT License

Add support for vertex colors #182

Closed gkjohnson closed 1 year ago

gkjohnson commented 2 years ago

Requires packing vertex colors into a texture and multiplying it into the albedo (similar to normals, uv data textures).

cc @alvinalexander not sure if this is something that might be interesting to you?

alvinalexander commented 2 years ago

Sure, I can take a look at this!

alvinalexander commented 2 years ago

@gkjohnson This might be a basic question, but how exactly should vertex colors influence the albedo? Right now albedo is set to material.color * material.map. Is vertex color just another multiplicative factor? If you have any resources, I would like to get a deeper understanding of albedo in general.

https://github.com/gkjohnson/three-gpu-pathtracer/blob/75024a97bb7b9e0d58bc401e3bbc8a910797c4f2/src/materials/PhysicalPathTracingMaterial.js#L207-L214

gkjohnson commented 2 years ago

Vertex color is just multiplied into the albedo but only if material.vertexColors === true
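In CPU-side pseudo-JavaScript the rule might look like the sketch below (illustrative only; the actual logic lives in the GLSL of PhysicalPathTracingMaterial, and `computeAlbedo` is a hypothetical name):

```javascript
// Hypothetical sketch of the albedo rule described above: vertex color
// is one more multiplicative factor, applied only when
// material.vertexColors is enabled.
function computeAlbedo( materialColor, mapSample, vertexColor, useVertexColors ) {

	const albedo = [];
	for ( let i = 0; i < 3; i ++ ) {

		let c = materialColor[ i ] * mapSample[ i ];
		if ( useVertexColors ) c *= vertexColor[ i ];
		albedo.push( c );

	}

	return albedo;

}
```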

alvinalexander commented 2 years ago

What is the best way to upload the vertex color data to the shader? My plan was to read the color buffer from the geometry object returned by three-mesh-bvh and add a new colorAttribute uniform to PhysicalPathTracingMaterial.js. However, geometry.attributes.color seems to be undefined even when I load a model with vertex colors.

https://github.com/gkjohnson/three-gpu-pathtracer/blob/2852e9e8dacb311d6288197abf1114275db65d53/example/index.js#L675-L685

vertex-color-cube.glb.zip

gkjohnson commented 2 years ago

@alvinalexander it actually looks like the vertex colors get dropped when merging geometry right now. When prepping geometry for upload we merge it all into one mesh which also means ensuring that all the merged geometries have the same attributes.

So when merging meshes in PathTracingSceneGenerator and DynamicPathTracingSceneGenerator we need to include color in the attributes that are going to be retained in the final geometry.

The mergeMeshes function also calls setCommonAttributes which ensures that all geometry includes a set of attributes and fills in sensible data if it does not. So if no color attributes are available on the geometry then we'll want to create a new buffer of all zeroes.

That's why the color data is missing anyway - hopefully that all makes sense! Once the color data is available you'll want to generate a texture for use in the shader similar to the ones you've linked to.
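The fallback step could be sketched roughly as follows (an illustrative stand-in, not the actual setCommonAttributes implementation; the attribute shape is simplified and the zero fill follows the suggestion above):

```javascript
// Illustrative sketch: when a merged geometry is missing the "color"
// attribute, fill in a placeholder buffer so every geometry exposes the
// same attribute set. Zero-filled per the suggestion above.
function ensureColorAttribute( attributes, vertexCount, itemSize = 4 ) {

	if ( ! attributes.color ) {

		attributes.color = {
			itemSize,
			array: new Float32Array( vertexCount * itemSize ), // zero-filled by default
		};

	}

	return attributes;

}
```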

alvinalexander commented 1 year ago

Thanks @gkjohnson

I made the changes you suggested, and I now see the color data as part of the bvh generator output. The data is stored as a Uint16Array, so I added a new uniform (colorAttribute) to the PathTracingMaterial and initialized it to a UIntVertexAttributeTexture. Now I would like to sample the color values using barycentric coordinates, but I am a bit confused by the texture format vs the sampler method. From what I can tell, the texture format is RGBA16UI, but it looks like you normalize the values to the 0-1 range here https://github.com/gkjohnson/three-mesh-bvh/blob/5b2b27f30dfd3a55ce5b05a7fcb63fac806eac3e/src/gpu/VertexAttributeTexture.js#L186-L189.

Is there a way to cast the texture to float in order to use the existing textureSampleBarycoord method?

gkjohnson commented 1 year ago

Awesome!

Is there a way to cast the texture to float in order to use the existing textureSampleBarycoord method?

As far as I know there's no way to use a Uint16 array and get normalized values on the GPU. I would use a Uint8 array instead. And then rather than the UIntVertexAttributeTexture type you should use FloatVertexAttributeTexture. The same array format will be used, but when uploaded to the GPU it will behave as a usual sampler with floating point [0, 1] values.
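The suggested down-conversion could be sketched as below (a minimal helper, assuming the source colors arrive as a Uint16Array in [0, 65535]; the function name is illustrative):

```javascript
// Rescale uint16 vertex colors to uint8 so the resulting texture can be
// sampled as normalized floating point [0, 1] on the GPU.
function uint16ToUint8Colors( src ) {

	const dst = new Uint8Array( src.length );
	for ( let i = 0; i < src.length; i ++ ) {

		dst[ i ] = Math.round( src[ i ] / 65535 * 255 );

	}

	return dst;

}
```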

gkjohnson commented 1 year ago

@alvinalexander not sure if you saw the issue but I just merged #244 which should have freed up an extra texture slot (we were at the max) to make room for the vertex color buffer texture!

alvinalexander commented 1 year ago

@gkjohnson Thanks for pointing this out. I haven't had time to work on this for a while, but I'll take a look this weekend.

Btw, I know we discussed using Uint8 arrays for vertex colors. However, it seems different exporters might use different array types for encoding vertex color data. For instance, when I export a cube with vertex colors as glTF from Blender, the colors are stored in a Uint16 buffer. So should we ignore vertex colors that are not encoded as Uint8?

I don't know much about color management, but I assume different exporters might use different color encoding schemes (sRGB + uint8, linear + uint16, etc.) and handling these will require different code paths.

Let me know what you think

gkjohnson commented 1 year ago

Yeah it's a good point - one issue is that three.js has not been very strict about vertex color space previously. Depending on the file format three.js might import vertex colors as linear or srgb and as uint8 or uint16. @donmccurdy - any recommendations or thoughts here?

I think you're right about using a larger precision format, though. Maybe using a uint16 or a float texture would be best, or in the future we could use a HALF_FLOAT texture. It might be worth looking at what the two vertex color demos in the model-viewer tests use to figure out the right format.

This is turning out to be a fairly in depth addition! Thanks for digging into it!

donmccurdy commented 1 year ago

Vertex colors in three.js are assumed to be in the working color space, and we don't apply conversions internally. Unless you're being very careful to do color management a different way manually, this would mean the vertex colors are Linear-sRGB. Similarly, vertex colors in glTF are always Linear-sRGB.

Using Linear-sRGB does mean that 8 bits may not be enough precision in some cases, particularly in darker colors. I'd think 16 bits would be more than enough for Linear-sRGB vertex colors.

You are probably fine using a uint8 sRGB texture, even if the original vertex colors were uint16. Adding sRGB encoding is buying you extra precision. In practice I don't see people reaching for higher-precision sRGB albedo textures often on the web, so I doubt there's any reason to give the vertex colors higher precision than the albedo map.
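The precision point can be illustrated numerically. `linearToSRGB` below is the standard sRGB transfer function; `codesBelow` is a hypothetical helper counting how many 8-bit code points land in a dark range of linear values:

```javascript
// Standard sRGB transfer function (linear [0, 1] in, encoded [0, 1] out).
function linearToSRGB( x ) {

	return x <= 0.0031308 ? 12.92 * x : 1.055 * Math.pow( x, 1 / 2.4 ) - 0.055;

}

// Hypothetical helper: number of distinct 8-bit codes covering linear
// values in [0, limit] for a given encoding function.
function codesBelow( limit, encode ) {

	return Math.round( encode( limit ) * 255 ) + 1;

}

// Dark linear values in [0, 0.01] get only a handful of codes when stored
// as 8-bit linear, but many more when stored as 8-bit sRGB, because the
// sRGB curve spends more of its 256 codes on dark tones.
const linearCodes = codesBelow( 0.01, x => x );
const srgbCodes = codesBelow( 0.01, linearToSRGB );
```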

alvinalexander commented 1 year ago

Thanks for the response @donmccurdy .

So just to check my understanding: we assume that the vertex colors stored in the model file are in the 16-bit Linear-sRGB color space. We then upload these colors to the GPU as a uint8 sRGB texture. The GPU will apply the sRGB encoding and normalize the values. Finally, the values can be sampled using textureSampleBarycoord?

@gkjohnson does the UIntVertexAttributeTexture support sRGB encoding?

gkjohnson commented 1 year ago

The GPU will apply the sRGB encoding and normalize the values. Finally the values can be sampled using textureSampleBarycoord ... does the UIntVertexAttributeTexture support sRGB encoding?

There's no need to apply the sRGB encoding to the texture itself. All of the path tracing operations happen in linear color space and are output in linear color space. The conversion to sRGB only happens right before the buffer is displayed to the screen.

And in terms of storing the colors to texture I think using the FloatVertexAttributeTexture would be the simplest place to start. And then we can deal with optimizing memory usage later.
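Following that suggestion, a minimal sketch of the array prep might normalize whatever integer type the exporter produced into linear float [0, 1] before handing it to FloatVertexAttributeTexture (the texture hookup itself is omitted; `toFloatColors` is a hypothetical name):

```javascript
// Normalize uint8/uint16 vertex colors to float [0, 1], leaving the
// values in linear space so no sRGB conversion happens before display.
function toFloatColors( array ) {

	if ( array instanceof Float32Array ) return array;

	const max = array instanceof Uint8Array ? 255 : 65535;
	const dst = new Float32Array( array.length );
	for ( let i = 0; i < array.length; i ++ ) dst[ i ] = array[ i ] / max;
	return dst;

}
```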