erichlof / THREE.js-PathTracing-Renderer

Real-time PathTracing with global illumination and progressive rendering, all on top of the Three.js WebGL framework. Click here for Live Demo: https://erichlof.github.io/THREE.js-PathTracing-Renderer/Geometry_Showcase.html
Creative Commons Zero v1.0 Universal

feature-request: Render to surface texture to generate object surface textures #3

Open PhilAndrew opened 7 years ago

PhilAndrew commented 7 years ago

If you set the camera up as a 2D grid of rays hovering above one side of a surface, you can generate the surface texture data and run Monte Carlo accumulation on it, progressively building up a texture for that surface.
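A minimal sketch of that idea in plain JavaScript (no three.js): one ray per texel, jittered within the texel, with samples averaged Monte Carlo style. `radiance()` here is a hypothetical stand-in for a real path tracer.

```javascript
// Hypothetical stand-in for a real path tracer: a real implementation
// would trace the scene; here we just fake a smooth value in [0, 1]
// from the ray origin.
function radiance(origin, dir) {
  return 0.5 + 0.5 * Math.sin(origin[0] * 10) * Math.cos(origin[1] * 10);
}

// Treat the "camera" as a 2D grid of rays above the surface, one per
// texel, and Monte Carlo-average many samples per texel.
function bakeSurfaceTexture(width, height, samplesPerTexel) {
  const texture = new Float32Array(width * height);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      let sum = 0;
      for (let s = 0; s < samplesPerTexel; s++) {
        // Jitter within the texel so samples cover its whole footprint.
        const u = (x + Math.random()) / width;
        const v = (y + Math.random()) / height;
        // Ray starts just above the surface, pointing straight down.
        const origin = [u, v, 1e-4];
        const dir = [0, 0, -1];
        sum += radiance(origin, dir);
      }
      texture[y * width + x] = sum / samplesPerTexel; // Monte Carlo average
    }
  }
  return texture;
}

const tex = bakeSurfaceTexture(64, 64, 16);
console.log(tex.length); // 4096
```

In a real renderer the accumulation would run progressively on the GPU (adding one sample per texel per frame), but the averaging step is the same.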

For a scene that is static while the player moves around in it, the surface textures can be generated accurately once, and after that they don't need to be computed in real time.

Here is an example I produced that does path tracing and renders to textures using PlayCanvas. My suggestion would be to allow this to scale out: save the surface textures back up to a server for caching so they don't need to be computed again, and let different client browsers compute different surfaces.
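A sketch (not from the thread) of that fetch-or-bake caching pattern: look a baked texture up by a surface id, and only run the expensive bake if no client has produced it yet. A `Map` stands in for the hypothetical server-side texture store.

```javascript
// Hypothetical server-side texture store; in the real scheme this would
// be an HTTP cache that clients upload baked textures to.
const textureStore = new Map();

function getSurfaceTexture(surfaceId, bake) {
  if (textureStore.has(surfaceId)) {
    return textureStore.get(surfaceId); // another client already baked it
  }
  const texture = bake(surfaceId);      // expensive path-traced bake
  textureStore.set(surfaceId, texture); // "upload" for other clients
  return texture;
}

// Example: the bake runs once per surface, no matter how many requests.
let bakes = 0;
const bake = () => { bakes++; return new Float32Array(16); };
getSurfaceTexture("rock-wall-03", bake); // "rock-wall-03" is a made-up id
getSurfaceTexture("rock-wall-03", bake);
console.log(bakes); // 1
```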

Here is that project on PlayCanvas: play it at https://playcanvas.com/editor/scene/492862/launch, or edit it at https://playcanvas.com/project/453830/overview/surface-shader. You need to spin the mouse wheel to zoom in and rotate the view to see inside.

image

erichlof commented 7 years ago

Hi @PhilAndrew , I tried the link you posted but it makes my browser lose its WebGL context and the screen goes black. I'm on a 2014 laptop which might be part of the problem, although it is what I develop everything on in this GitHub repo. Could you post a lighter example?

But in any case, thanks for the suggestion. Just want to make sure I understand correctly: let's say I wanted a rock texture. Would it be like placing the camera directly over a gray-scale height map, looking down, kind of like a satellite view of a mountain range on Google Maps, and then tracing rays directly downwards? I can envision getting a good first intersection (or 't' distance along the primary ray to the surface), but what about secondary ray bounces to capture global illumination/indirect lighting? I can't envision the algo/math to shoot sideways along a height field texture in search of a second intersection point, once we're sitting on the first intersection point.

Maybe it's in the code you posted but I just couldn't see it. The pic looks great though, thanks for sharing! -Erich
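(For reference, not from the thread: the "sideways" secondary-ray search described above is commonly handled by ray-marching the height field, i.e. stepping along the secondary ray in small increments and reporting a hit once the ray drops below the stored height. A minimal sketch, with a hypothetical `heightAt` sampler standing in for a height-map texture lookup:)

```javascript
// Hypothetical height-map sampler; a real version would read a texture.
// This placeholder terrain has heights in [0, 0.2].
function heightAt(x, z) {
  return 0.1 + 0.1 * Math.sin(x * 5) * Math.sin(z * 5);
}

// March from `origin` along `dir` in increments of `step`; return the
// approximate hit point, or null if nothing is hit within `maxDist`.
function marchHeightField(origin, dir, maxDist, step) {
  let prev = origin.slice();
  for (let t = step; t < maxDist; t += step) {
    const p = [origin[0] + dir[0] * t,
               origin[1] + dir[1] * t,
               origin[2] + dir[2] * t];
    if (p[1] < heightAt(p[0], p[2])) {
      // The ray crossed the surface between prev and p; use the midpoint
      // as a cheap refinement (a real tracer would bisect further).
      return [(prev[0] + p[0]) / 2,
              (prev[1] + p[1]) / 2,
              (prev[2] + p[2]) / 2];
    }
    prev = p;
  }
  return null; // no hit within maxDist
}
```

A secondary ray leaving the first intersection point at a shallow angle will eventually dip below `heightAt` and return a hit, while a ray escaping upward returns null; shading that hit (or sampling the sky on a miss) is what gives the indirect-lighting contribution.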

PhilAndrew commented 7 years ago

Firstly, let me show you some more pics. You can see that from the outside the texture mapping is wrong and a bit crazy, but inside the texture mapping is correct. The object can be rotated, and on my computer it only takes about 5 seconds to settle down to a nice texture. So, OK, I'm on a relatively high-end graphics card in a relatively new PC.

image

image