gkjohnson / three-gpu-pathtracer

Path tracing renderer and utilities for three.js built on top of three-mesh-bvh.
https://gkjohnson.github.io/three-gpu-pathtracer/example/bundle/index.html
MIT License

Add light mapping / AO baking demo #5

Open gkjohnson opened 2 years ago

gkjohnson commented 2 years ago

UV Unwrapping

Lightmap Generation

  1. Optionally generate a UV map
  2. Adjust the path tracing shader to project into UV space rather than world space.
  3. Flood the texture gaps (use mip maps?)
  4. Apply to model
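For step 3 ("flood the texture gaps"), a minimal CPU-side sketch of the typical pixel-by-pixel dilation (helper name hypothetical, not part of the library): each uncovered texel copies the average of its covered neighbors, padding the UV island margins by one texel per pass.

```js
// Single-channel dilation for flooding UV seams (illustrative sketch).
// `data` holds texel values, `mask` marks which texels are covered by UV islands.
function dilate( data, mask, width, height ) {

	const outData = data.slice();
	const outMask = mask.slice();

	for ( let y = 0; y < height; y ++ ) {
		for ( let x = 0; x < width; x ++ ) {

			const i = y * width + x;
			if ( mask[ i ] ) continue; // already covered

			// average the covered texels in the 3x3 neighborhood
			let sum = 0, count = 0;
			for ( let dy = - 1; dy <= 1; dy ++ ) {
				for ( let dx = - 1; dx <= 1; dx ++ ) {
					const nx = x + dx, ny = y + dy;
					if ( nx < 0 || ny < 0 || nx >= width || ny >= height ) continue;
					const n = ny * width + nx;
					if ( mask[ n ] ) { sum += data[ n ]; count ++; }
				}
			}

			if ( count > 0 ) {
				outData[ i ] = sum / count;
				outMask[ i ] = 1;
			}

		}
	}

	return { data: outData, mask: outMask };

}
```

Repeating the call expands the islands by one texel per pass; a real implementation would likely run this on the GPU.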
Hoodgail commented 2 years ago

When will this issue be closed and complete? 👀

gkjohnson commented 2 years ago

Depends on if and when someone wants to work on it. I have no plans to put time into this at the moment. But feel free to open a PR if you're interested in helping it along, and I can provide direction if needed.

gfodor commented 1 month ago

I may take a closer look at this next week. Given that the issue was opened a long time ago, if there are any gotchas or other thoughts on how to do this properly (e.g., if anything about the code has changed that should be taken into account), comments would be appreciated!

gkjohnson commented 1 month ago

That would be great @gfodor! The first step should be making a utility that can generate new UVs on a new channel for a set of geometries using xatlas. I'm thinking something like this:

```js
// create the generator and assign any options
const uvGenerator = new UVGenerator();
uvGenerator.channel = 2; // uv channel to generate uvs for

// function takes a geometry or array of geometries and generates and assigns
// a new set of UVs to the geometry
uvGenerator.generate( geometries );
```

The architecture for WebGLPathTracer has changed quite a bit since this issue was made so I'll have to take a look to see what adjustments we can make to make it compatible with an AO map generation material. But it shouldn't be too hard and the above is a big first step.

To get ahead of it, I think a good API for generating the AO map would look like so:

```js
// create the generator and assign options
const generator = new AOMapGenerator( renderer );
generator.channel = 2;                // uv channel to use for the ao map
generator.samples = 30;               // the number of samples to take
generator.floodMargins = true;        // whether to flood the margins of the ao map or not
generator.scene = scene;              // the scene to use for lighting

// generate a new AOMap for all passed geometries on the render target
generator.generate( geometries, renderTarget );
```
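As a rough idea of what the `samples` option implies: AO baking typically takes one stochastic occlusion sample per texel per pass and keeps a running average. A minimal sketch of that accumulation (helper name hypothetical; the real generator would do this on the GPU):

```js
// Running-average accumulation over N stochastic AO sample passes.
// `sampleFns` is an array of functions, each returning one occlusion
// sample (0..1) per texel for a single pass.
function accumulateAO( sampleFns, texelCount ) {

	const result = new Float32Array( texelCount );

	for ( let s = 0; s < sampleFns.length; s ++ ) {

		const pass = sampleFns[ s ]( texelCount );
		for ( let i = 0; i < texelCount; i ++ ) {

			// incremental mean: avg += ( x - avg ) / ( n + 1 )
			result[ i ] += ( pass[ i ] - result[ i ] ) / ( s + 1 );

		}

	}

	return result;

}
```

The incremental mean keeps the intermediate texture displayable after every pass, which is what makes progressive baking previews possible.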

Happy to answer any other questions on this - I appreciate the help!

gfodor commented 1 month ago

Cool! Will the same method here be easy to extend to broader light maps (or is that the same thing entirely as AO maps)? The stretch goal is that I would also like to gather light probes, with the intent of replacing the blender workflow here: https://github.com/gillesboisson/threejs-probes-test

gkjohnson commented 1 month ago

Is the same method here going to be easy to extend to doing broader light maps

My understanding of lightmaps is that structurally they're basically the same as AO maps. They can use the same UV channels and UV layout, etc - they will just include light and color accumulated from more bounces and use proper material models. There is a question of whether the lightmap should include only secondary bounce light or the primary lighting as well - I guess it depends on how you want to light your scene with real time lights.

The stretch goal here is I would like to gather light probes too, with the intent to try to replace the blender workflow here: https://github.com/gillesboisson/threejs-probes-test

I'm less familiar with light probes and how they're generated, but if you're familiar with them and can describe what you need at a high level, I can try to provide some direction. But I think we can discuss this in another issue and focus on AO maps first.

gfodor commented 1 month ago

Yep AO maps seem like a good place to start. By flipping the normals we can also generate thickness maps, which is useful for real time subsurface scattering.
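A minimal sketch of the geometry inversion that "flipping the normals" implies (plain arrays here for illustration; a real implementation would operate on BufferGeometry attributes, and the helper name is hypothetical): negate each normal component and reverse the triangle winding so back faces become front faces.

```js
// Turn a mesh "inside out" for thickness baking: negate normals and
// reverse index winding (illustrative sketch, not the library's API).
function invertGeometry( normals, indices ) {

	// negate every normal component
	const flippedNormals = normals.map( n => - n );

	// swap the 2nd and 3rd vertex of each triangle to reverse winding
	const flippedIndices = indices.slice();
	for ( let i = 0; i < flippedIndices.length; i += 3 ) {

		const tmp = flippedIndices[ i + 1 ];
		flippedIndices[ i + 1 ] = flippedIndices[ i + 2 ];
		flippedIndices[ i + 2 ] = tmp;

	}

	return { normals: flippedNormals, indices: flippedIndices };

}
```

With the geometry inverted, the same AO bake measures how occluded the interior is, which reads as a per-texel thickness estimate.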

gfodor commented 1 month ago

OK, managed to get this far. Need to flood fill things next.

[image]
gkjohnson commented 1 month ago

Nice! Looks great - not sure how you're planning to do the flood fill, but this is a technique from SIGGRAPH a few years ago, used in God of War, that could speed things up over the typical pixel-by-pixel expansion:

https://www.artstation.com/blogs/se_carri/XOBq/the-god-of-war-texture-optimization-algorithm-mip-flooding
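For reference, the core of the mip-flooding idea can be sketched in a few lines (single channel, power-of-two sizes and a hypothetical helper name assumed): downsample while averaging only covered texels, then fill each empty full-resolution texel from the first mip level that has coverage there. Unlike per-pass dilation, this closes arbitrarily large gaps in one O(n log n) sweep.

```js
// Mip flooding sketch: build a coverage-aware mip chain, then pull gap
// texels from coarser mips. `data`/`mask` are size*size single-channel arrays.
function mipFlood( data, mask, size ) {

	// build the mip chain, averaging only covered texels at each level
	const mips = [ { data, mask, size } ];
	for ( let s = size; s > 1; s /= 2 ) {

		const prev = mips[ mips.length - 1 ];
		const half = s / 2;
		const d = new Float32Array( half * half );
		const m = new Uint8Array( half * half );

		for ( let y = 0; y < half; y ++ ) {
			for ( let x = 0; x < half; x ++ ) {

				let sum = 0, count = 0;
				for ( let dy = 0; dy < 2; dy ++ ) {
					for ( let dx = 0; dx < 2; dx ++ ) {
						const i = ( 2 * y + dy ) * s + ( 2 * x + dx );
						if ( prev.mask[ i ] ) { sum += prev.data[ i ]; count ++; }
					}
				}

				if ( count > 0 ) { d[ y * half + x ] = sum / count; m[ y * half + x ] = 1; }

			}
		}

		mips.push( { data: d, mask: m, size: half } );

	}

	// fill empty texels from the first coarser mip with coverage
	const out = Float32Array.from( data );
	for ( let y = 0; y < size; y ++ ) {
		for ( let x = 0; x < size; x ++ ) {

			if ( mask[ y * size + x ] ) continue;
			for ( let level = 1; level < mips.length; level ++ ) {

				const mip = mips[ level ];
				const i = ( y >> level ) * mip.size + ( x >> level );
				if ( mip.mask[ i ] ) { out[ y * size + x ] = mip.data[ i ]; break; }

			}

		}
	}

	return out;

}
```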

gfodor commented 4 weeks ago

OK, PR opened. This ended up being a lot more work than I expected, but I think it turned out well.

https://github.com/gkjohnson/three-gpu-pathtracer/pull/670

gkjohnson commented 2 weeks ago

An interesting idea for improving the look of lightmaps:

https://x.com/john_clayjohn/status/1825682852324979143

[image]