erichlof / THREE.js-PathTracing-Renderer

Real-time PathTracing with global illumination and progressive rendering, all on top of the Three.js WebGL framework. Click here for Live Demo: https://erichlof.github.io/THREE.js-PathTracing-Renderer/Geometry_Showcase.html
Creative Commons Zero v1.0 Universal

feature-request: Smooth monte-carlo #2

Open PhilAndrew opened 7 years ago

PhilAndrew commented 7 years ago

There are some ways to make the Monte Carlo result look nicer in a shorter amount of time; one I found is here: https://benedikt-bitterli.me/nfor/ I saw another on Shadertoy but I can't find it just now; I'll look later and try to find it (it was a different method). It would be nice for an image that is not moving to smooth out faster.

Also, it occurred to me that a trained neural network would likely give a good solution for smoothing Monte Carlo output.

erichlof commented 7 years ago

Hi @PhilAndrew Thank you for the link to that paper! I will definitely check it out; they even have some pseudo-code on the website. All too often, I get excited about this paper or that, only to get to the end of the document and not know how to translate the math and abstractions into real GLSL shader code.

If you find that Shadertoy monte-carlo example, please post a link. Something like that I could definitely try because it is already in a language I deal with on a daily basis. :)

One caveat with Monte Carlo smoothing is that interactivity decreases while in 'pretty' mode, and again when getting out of 'pretty' mode. For example, say I'm flying the first-person camera around; everything is 30-60 fps, no problem. Then I stop to inspect an object. Ok fine, now we go into photo-realistic high-quality mode. In my experience, naive approaches such as throwing more samples at the screen negatively impact the framerate, understandably so: we were doing 4-bounce depth at 1 spp (4 bounces a frame), and now we want 8 spp (32 bounces or more a frame). To some degree, this is how all the major path tracers/renderers do it. So the frame rate drops to 15 or 10 fps. So far so good; you wouldn't notice the frame-rate drop in a static indoor scene. But once you want to break away from this view and continue flying the camera, the delay between your initial mouse move, breaking out of pretty mode, and getting back into real-time flying mode was annoying and, at times, unbearable for me. The interactivity as a whole seemed to go out the window.
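The trade-off described above can be sketched in a few lines of JavaScript. All names here are illustrative, not the repo's actual render loop: per-frame cost scales with samples-per-frame times bounce depth, and the progressive accumulation must be thrown away the moment the camera moves again.

```javascript
// Toy sketch (hypothetical names) of the interactive vs. "pretty" mode
// trade-off: per-frame ray cost = spp * BOUNCES, and the progressive
// average resets whenever the camera moves.
const BOUNCES = 4;

function makeRenderer() {
  return { sampleCount: 0, accumulated: 0 };
}

// One frame: fold `spp` new samples into the running average,
// return the ray-bounce cost of this frame.
function renderFrame(state, spp, sampleFn) {
  let frameSum = 0;
  for (let i = 0; i < spp; i++) frameSum += sampleFn();
  state.accumulated =
    (state.accumulated * state.sampleCount + frameSum) /
    (state.sampleCount + spp);
  state.sampleCount += spp;
  return spp * BOUNCES;
}

function onCameraMove(state) {
  // Breaking out of "pretty" mode discards everything accumulated so far.
  state.sampleCount = 0;
  state.accumulated = 0;
}
```

With 4 bounces, 1 spp costs 4 bounces a frame while 8 spp costs 32, which is roughly the jump from 30-60 fps down to 10-15 fps described above; and since `onCameraMove` zeroes the accumulation, every re-entry into 'pretty' mode starts converging from scratch.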

One of the goals of this project is to keep the frame rate as high as possible. My dream would be something like Brigade 2 or Brigade 3 by Otoy, rather than Cycles or Octane, if that makes sense (although I'll probably never get there because I'm working inside WebGL and it's just me, a hobbyist, but it's fun to dream! lol).

Again, thanks for the paper link and please let me know if you find that Shadertoy example! -Erich

PhilAndrew commented 7 years ago

What you did is surprisingly fast anyway; compared to the Shadertoy examples, it seems a little faster than those.

PhilAndrew commented 7 years ago

Here are the links:

Not a Shadertoy, but maybe interesting: http://dev.ipol.im/~mdelbra/rhf/

For volumetrics: https://www.shadertoy.com/view/ldXGzS

Direct light using MIS (for this one, hold the mouse down on the rendered image to compare the left and right halves): https://www.shadertoy.com/view/4sSXWt

Also, there are new papers every day... like this one I found on my Twitter today: http://graphics.cs.williams.edu/papers/PhotonI3D13/Mara13Photon.pdf

erichlof commented 7 years ago

Hi @PhilAndrew

Thanks so much for those links! I was aware of the volumetric example by sjb on Shadertoy because I had borrowed some of sjb's code for sampling spherical and point lights in a volume: https://www.shadertoy.com/view/Xdf3zB The example you linked to implements the same technique, but with rectangular area lights in a volume, something I'm hoping to implement soon.

I had also seen the MIS Shadertoy example, and I really like the look of it. However, Eric Veach's original math/algorithms and the Shadertoy implementation are pretty hefty and a little over my head. I might revisit it though, because it is a very robust solution for all material/lighting situations.
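For what it's worth, the core of Veach's MIS machinery is smaller than the full papers suggest. A minimal sketch of his balance heuristic (standard MIS, not code from the linked Shadertoy): when the same light-carrying direction could have been produced by two sampling strategies, say BSDF sampling with density `pdfA` and light sampling with density `pdfB`, each sample is weighted by its own pdf over the sum, so each strategy dominates where it is the better sampler.

```javascript
// Veach's balance heuristic for Multiple Importance Sampling:
// weight a sample from strategy A by its pdf relative to the
// combined pdf of both strategies.
function balanceHeuristic(pdfA, pdfB) {
  return pdfA / (pdfA + pdfB);
}
```

The two weights for any given direction always sum to 1, which is what keeps the combined estimator unbiased while suppressing the strategy that would be noisy there.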

I will definitely check out the RHF (Ray Histogram Fusion) paper; it even has source code (yay!).
And the real-time photon mapping paper looks interesting, although I have never really looked into photon mapping before. My initial impression a while ago was that it had some hefty memory requirements and start-up times. But I'll check this new paper out.

Regarding convergence speed: before I added direct light sampling, convergence was painfully slow. After adding direct lighting (which some of the Shadertoy examples don't really implement), the convergence speed rocketed! Diffuse materials converge almost instantly. Still, the remaining bottleneck for convergence speed is bright caustics shining on diffuse surfaces (bright light to mirror to wallpaper, or bright light through a glass sphere onto the floor). But maybe some of the papers you linked to can help mitigate these issues.
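The reason direct light sampling helps so much can be shown with a toy one-dimensional illustration (a hypothetical setup, not the repo's shader code): a small light subtends a tiny solid angle, so random hemisphere rays almost never hit it, while an explicit shadow ray samples it every time.

```javascript
// Toy illustration of why direct light sampling (next-event estimation)
// converges faster than naive hemisphere sampling. Both estimators have
// the same expected value, but very different variance.
const LIGHT_COVERAGE = 0.01; // fraction of the hemisphere the light covers
const EMISSION = 100;        // light radiance

// Naive: pick a random direction; score only if it happens to hit the light.
// Almost always 0, occasionally 100 -> very noisy.
function naiveSample(rng) {
  return rng() < LIGHT_COVERAGE ? EMISSION : 0;
}

// Next-event estimation: aim a shadow ray at the light and weight by its
// coverage (the sampling pdf). Here this is exact with zero variance.
function directLightSample() {
  return EMISSION * LIGHT_COVERAGE;
}

// Population variance of a list of sample values.
function variance(samples) {
  const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
  return samples.reduce((a, s) => a + (s - mean) ** 2, 0) / samples.length;
}
```

In this toy case the naive estimator's samples are 0, 0, 0, ..., 100, so its running average flickers for a long time, which is exactly the slow-convergence behavior described above; caustics remain hard because the mirror or glass bounce in the middle blocks the direct shadow-ray shortcut.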

Yes, there seem to be new path tracing papers every week! As our computers/graphics cards get more capable these days (heck, my cell phone can run all the examples in this GitHub repo!), I think real-time demo/game artists are going to slowly move away from the sometimes limited/hacky rasterization pipeline toward the path tracing pipeline.

Thanks again for the links! -Erich

PhilAndrew commented 7 years ago

https://twitter.com/morgan3d/status/865728937527267346

New neural network Monte Carlo smoother

PhilAndrew commented 7 years ago

Maybe useful

https://twitter.com/tunabrain/status/872174108385136640

bit2shift commented 7 years ago

You should take a look at Metropolis Light Transport before trying to find "fancy" ways of denoising the image.

erichlof commented 7 years ago

Hi @bit2shift

I recently implemented bi-directional path tracing (from Eric Veach, the same author who created Metropolis Light Transport). Check out the new BiDirectional Demo. However, I couldn't implement his full algorithm because A. it was designed for non-realtime CPU renderers back in the 90's and has higher memory demands, and B. I couldn't quite wrap my brain around his "path weighting" details.

So far as Metropolis Light Transport is concerned, part of the problem with even getting started with this technique is that it requires a full bi-directional path tracing pass to be done first, which then serves as the starting point for mutating paths in the Metropolis portion. Since I don't even know how to implement the full bi-directional algo as outlined by Veach, I can't begin on the Metropolis algo. Plus, the Metropolis path mutating and weighting is very involved and too hard for me to grasp at this point.

Thank you for the suggestion though! I might be able to incorporate some of the over-arching ideas into my GPU realtime renderer. :-)

PhilAndrew commented 6 years ago

https://cs.dartmouth.edu/~wjarosz/publications/mara17towards.html

FishOrBear commented 6 years ago

https://www.shadertoy.com/view/MsXfz4

erichlof commented 6 years ago

@FishOrBear Thanks for the link! That Shadertoy example crashes my computer, though. I'll just post this reduced sample by the same author in case others have crashing problems; this one seems to work on more computers: Alt. ShaderToy Link

FishOrBear commented 6 years ago

https://www.shadertoy.com/view/XlXfDs

@erichlof

This is the latest example; it converges in a very short period of time.

I tried to access the page from an iPhone and it could not run, while a simpler example could, so iOS may be the problem.

It runs fine for me on Windows 10 with Chrome 64.