RenderTool opened this issue 4 years ago
Hello @q750831855 Thank you for the picture in your issue - it helped me better understand what the problem is. The cause of the white noisy spots is that the SUN_DIRECTION vec3 variable was hard-coded in the shader (HDRI_Environment_Fragment.glsl) and was meant to be used with the symmetrical_garden_2k.hdr environment. When you changed the path to kiara_5_noon_2k.hdr, the new image loads fine, but the sun's exact direction vector has changed as well, so the old hard-coded value for the symmetrical garden is no longer valid.
On all diffuse and clearcoat objects, such as the ground and the white plastic sphere on the glass table, there is a direct light sample toward the sun (sometimes called a shadow ray), as well as a GI bounce gather in a randomly chosen direction that gives realistic GI color bleeding from nearby surfaces. What is happening is that when you tried to sample the new environment, the direct sun-sample rays were being shot toward a sun that is no longer there, giving a dark ground to start with. Then, when one of the random GI diffuse bounces just happens to hit the sun, it reports a super-bright white pixel (a small image of the sun). Since it is a purely random hit, the bright spots appear as noise over the dark ground that wasn't able to find the sun with the direct sun sample.
To correct this, the exact 3D direction of the sun for that particular file needs to be found and set as the SUN_DIRECTION vector near the top of the glsl shader. To assist in finding this seemingly arbitrary sun direction for any environment, I created (and commented out) a large red metal cylinder that points in the direction where it thinks the sun is located. You can use this as an aiming device to zero in on the sun location. You'll know when you have found it - the ground will immediately turn a nearly uniform bright white, with no gaps in sunlight.
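Purely as an illustration of that aiming-device idea (this is not the repo's actual debug code - makeSunAimCylinder and the starting vector are hypothetical), a three.js-side sketch might look like this:

```javascript
import * as THREE from 'three';

// Hypothetical helper: a long, thin red cylinder whose axis points along a
// candidate sun direction, so it can be lined up with the sun in the HDRI background.
function makeSunAimCylinder(sunDirection, length = 1000) {
  const geometry = new THREE.CylinderGeometry(2, 2, length, 16);
  geometry.translate(0, length / 2, 0); // base at the origin, extending along +Y
  const cylinder = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ color: 0xff0000 }));
  // rotate the cylinder's +Y axis onto the candidate direction
  cylinder.quaternion.setFromUnitVectors(
    new THREE.Vector3(0, 1, 0),
    sunDirection.clone().normalize()
  );
  return cylinder;
}

// usage: tweak the vector by trial and error until the cylinder visually lines up
// with the sun (assuming an existing THREE.Scene named 'scene')
scene.add(makeSunAimCylinder(new THREE.Vector3(-0.5, 0.9, 0.2)));
```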
Here is a link to a gist with the corrected SUN_DIRECTION vector defined near the beginning of the glsl fragment shader: corrected fragment shader
Try this out with the Kiara noon environment and it should be correct. Unfortunately, the only way I currently have to find the sunlight direction in each environment is to use the red cylinder and trial and error. It would be nice if the creators of these HDRIs included a readme file stating the actual location of the sun in each file. But that's the only way I know of as of now, and they're free to use - so I can't really complain.
And by the way, the reason this didn't happen in older versions of my HDRI demo is that I was only using the random GI bounce for all the lighting, and I was trying to down-weight rays that happened to hit the sun. This worked OK I guess, but it wasn't as realistically bright in sunlight, and it took a long time for the ground to converge. Now, with the new direct SUN_DIRECTION light sample, diffuse surfaces like the ground converge almost instantly!
Hope this helps, best of luck! -Erich
Hello @erichlof It seems that this method is only applicable to images with a sun. What if there is no sun in my envmap?
Hello @Vondila
Unfortunately, this is one of the areas of HDRI environments that I haven't figured out yet. When I was first learning about using an HDRI map for the environment background and lighting, I happened to choose outdoor images that had some sort of sunlight in them. Admittedly, on the free HDRI Haven website where I downloaded the ones to start learning with, these were the first to pop up near the top of the site, so I just started with those (ha).
Right away, even with a definite singular sun light source in plain view, I ran into the same problem as the original poster who opened this issue. I had sun 'fireflies' all over the place ('white noise', as he called it), because I was just randomly sampling the hemisphere oriented around the surface normal of all the scene surfaces. Some sample rays would randomly escape to the background (t == INFINITY in my shader code), and they would look fine, collecting whatever background color or ambient sky/environment lighting was there. However, about 1 out of 1000 rays would happen to hit the sun disk just right and report back a really bright sun pixel among its darker 'normal' neighbor pixels, hence the white firefly noise.
I am embarrassed to say that I couldn't even figure out how to convert the sun disk's occupied texture pixels into a 3D SUN_DIRECTION vector. I'm sure it's possible, but I hadn't found a math resource/tutorial for doing so. So rather than giving up on HDRIs altogether, I resorted to a simple brute-force empirical method to find the sun direction vector. I created a test red metal cylinder that extends infinitely (pictured above in my reply to the o.p.), and I literally just kept trying different x,y,z direction vectors until the red cylinder found the sun disk (lol). Then I wrote down that winning combination of vector components and called it SUN_DIRECTION in the shader. Now all the diffuse and clearCoat diffuse surfaces could send the bulk of their hemisphere sampling rays toward a definite target. Then, if they successfully found the sun (unless they were blocked by another object and thus in shadow), I compensated for the extra direct light samples and down-weighted their contribution, as we must always do with anything Monte Carlo related. This works great for anything outdoors.
Now with your target image, which is completely indoors, there are 2 additional problems. The first is that there is not a singular dominant light source, like a bright ceiling quad light or a large spherical light bulb, but rather many different tubular lights. Although each of these can be sampled using a long rectangular light shape (SampleQuadLight in the PathTracingCommon library file), which would give satisfactory results even with this rough thin-quad approximation, to get decent results we would need to decide how to spread the samples among the various lights. This is trivial if each light source puts out the same power and is roughly the same size, but with an arbitrary image, the light dimensions and power might not be known ahead of time, even by the image's creator.
The second problem is that once the rectangular HDRI image is stretched out around the scene, where are all the quad lights now? And they are not a simple circular shape like the sun in the easier outdoor scenes. Even if we could figure out the texel area covered by each light (I guess by finding the corresponding UV texture coordinates), I don't know how to re-project that light's thin rectangular area onto a sphere so that it can be turned into diffuse sample rays that head right for the light's area once the image is stretched around the final 360-degree environment.
Again, I'm sure all of this is possible, because we have all seen renderings using these types of indoor HDRIs - I'm just not sure where to begin with the math required (which I imagine would be quite tricky with these non-directional area lights). My only suggestion for you and others wanting white-noise-free renderings with indoor images at this point is to try the brute-force method I used for outdoor scenes: create thin white quads and render them as extra objects, like I did with the red metal cylinder in the outdoor scenes. Then render with the HDRI image in the background and line the quads up as best you can so that they roughly cover the general area of the image's tubular lights. Once you have that in place, the rest is a piece of cake: randomly pick 1 thin quad to sample each animation frame, sample it with the direct lighting algo (used in all my demos with quad lights all over this repo), then multiply the chosen light's contribution by the total number of quad lights to compensate for the fact that we missed out on all the other lights that frame. This helps the frame rate stay at 60 FPS, keeps the final image as unbiased as possible (with our admitted approximations), and converges quite quickly once all the lights have been located and each has a fair shot at being chosen.
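To make that "pick 1 light, then multiply by the light count" bookkeeping concrete, here is a tiny JavaScript sketch of the estimator. sampleDirectLight and the toy light list are stand-ins for illustration only, not the renderer's actual SampleQuadLight code:

```javascript
// Minimal sketch of the one-light-per-frame estimator described above.
// 'sampleDirectLight' is a hypothetical stand-in for the real quad-light sampling
// routine; the n-times compensation factor is the important part.
function estimateDirectLighting(quadLights, sampleDirectLight) {
  const n = quadLights.length;
  const chosen = quadLights[Math.floor(Math.random() * n)]; // uniform pick, probability 1/n
  // multiplying by n (i.e. dividing by the pick probability) keeps the estimate unbiased
  return sampleDirectLight(chosen) * n;
}

// toy usage with made-up light "contributions":
const lights = [{ power: 1.0 }, { power: 2.0 }, { power: 0.5 }];
console.log(estimateDirectLighting(lights, light => light.power));
```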
Sorry I couldn't be of more algo/mathematical help on this shortcoming of both the renderer and of my knowledge. If there's a resource out there about how to do this exact thing, maybe I can learn from it someday and incorporate it into the renderer so we all can benefit!
Let me know how it goes if you try my wacky brute force empirical method! :-) Best of luck, -Erich
@knightcrawler25
Hello! Thank you so much for the references and the links to your implementation! This will be super helpful. I had forgotten that the PBR book had an explanation of their method. I'm going to read up on this right now!
Thanks again!
Hello to all involved with this thread (and those who may have just found it for the first time),
I am back with good news and promising steps forward in combating this shortcoming (not being able to mathematically locate the sun, or arbitrary light sources, once the HDRI has been wrapped around the path-traced scene in a 360-degree spherical fashion).
Thanks to some of the info in the helpful resources linked by @knightcrawler25, I have taken the first step toward understanding how to, first of all, locate the sun light source in an outdoor environment HDRI, and then convert that location into something the GPU path tracer can understand when it is sampling direct sunlight for diffuse and clearCoat diffuse surfaces in the scene.
If you look at the updated HDRI_Environment.js code, I have a TODO in place to write a function that takes the newly loaded HDRI image and iterates over every pixel, looking for the brightest (and therefore most important) pixels and noting their x,y locations. Until that nifty future function is implemented, in the meantime I had to resort to just a tad of naive brute-force pixel locating (lol): opening the HDRI in Photoshop, clicking on the middle of the sun with the eyedropper tool, and noting the x,y pixel location. But that's the only naive brute-force thing left to fix, I promise! (ha).
Once I had the sun's center pixel location, I devised the following plan to successfully retrieve a 3D world direction vector that points towards the center of the sun inside the 3D pathtraced scene:
HDRI image: symmetrical_garden_2k
Image dimensions: 2048 x 1024
Central sun pixel location: (396, 174)
Normalize to get texture (u, v) coordinates (floating point in the 0.0-1.0 range): (396 / 2048, 174 / 1024) = (0.193359375, 0.169921875)
Map the bright-light texture (u, v) coordinates to spherical coordinates (phi, theta): (phi, theta) = (v * PI, u * 2 * PI) = (0.169921875 * PI, 0.193359375 * 2 * PI)
Using the three.js math library, convert the spherical coordinates into 3D Cartesian coordinates (x, y, z):
let lightTargetVector = new THREE.Vector3();
lightTargetVector.setFromSphericalCoords(1, phi, theta); // the 1 is the radius, so a unit sphere with a radius of 1
Resulting sun 3D direction vector in world space (x must be negated, due to three.js' right-handed coordinate system, I think): (x: -0.47694634304269057, y: 0.8608669386377673, z: 0.17728592673600096)
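Putting those steps together, a minimal self-contained snippet (using the numbers above; not necessarily the exact code that ends up in HDRI_Environment.js) would be:

```javascript
import * as THREE from 'three';

// symmetrical_garden_2k: 2048 x 1024, sun center found at pixel (396, 174)
const imgWidth = 2048, imgHeight = 1024;
const sunPixelX = 396, sunPixelY = 174;

const u = sunPixelX / imgWidth;   // 0.193359375
const v = sunPixelY / imgHeight;  // 0.169921875

const phi = v * Math.PI;          // polar angle, measured down from the zenith
const theta = u * 2.0 * Math.PI;  // azimuthal angle around the Y axis

const lightTargetVector = new THREE.Vector3();
lightTargetVector.setFromSphericalCoords(1, phi, theta); // radius of 1 -> unit direction
lightTargetVector.x *= -1; // negate x to match the scene orientation, as noted above

console.log(lightTargetVector); // approx (-0.4769, 0.8609, 0.1773)
```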
So just looking at the resulting sun direction vector, we can see that it points to the left (-x), up (+y), and just a hair off of the 0 z direction (+z: 0.177...). And lo and behold, if we stretch the HDRI around the scene in a 360-degree fashion, the surfaces all successfully find the sun to efficiently sample from:
Now if you don't see a difference between this demo and the old method's demo, that's actually a good thing, because it means my newly learned math is holding together (lol). Also, as an interesting side note, I printed out the sun direction vector from the old naive way (using the infinite red metal cylinder to dumbly find the sun) vs. the new, mathematically sound method. Here are the results:
My old naive eyeballing method: SunDirection = (x: -0.4776591005752502, y: 0.8606470280635138, z: 0.17643264075302031)
New, robust mathematical method: SunDirection = (x: -0.4769463430426905, y: 0.8608669386377673, z: 0.17728592673600096)
We're talkin' hundredths of a degree here: not bad for my old eyeballs, ha ha! But seriously, this is good because it proves that we can just (in the near future) load in an arbitrary HDRI image, locate the sun's central pixels, and then convert that into something that three.js can use and then ultimately something that the GPU pathtracer can use.
Although this first step is promising, I have a ways to go before I can record and importance-sample HDRIs containing many arbitrary light sources other than a single sun disk. I have to really grasp something visually (like I outlined in the steps above) before I can start coding it up and placing it in the renderer as a new feature. I think I can get there eventually, but it might take some more time - the math for sampling multiple arbitrary lights in the linked resources looks pretty hairy, and I'm no mathematician by any means, ha. But hopefully we'll get there!
-Erich
@q750831855 and @Vondila
Good news! I implemented the previously mentioned loop over the HDR texture's pixel data to find the brightest pixels. That means this issue has been fixed for all outdoor scenes with some sort of visible sun, or a partly cloud-covered sun. I successfully removed all the hard-coded sun direction vectors (found with the red metal debug cylinder) and hard-coded bright pixel locations (found in Photoshop) from the earlier attempts. You'll see that I heavily commented the new, better approach and its code inside the updated HDRI_Environment.js file. Now you should be able to load in any arbitrary outdoor scene and get automatic natural light source finding and, therefore, good noise reduction. Yay!
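For reference, a scan like the one described might look roughly like this. This is a hedged sketch, assuming the HDR was loaded with a float data type so that hdrTexture.image.data is a flat RGBA array; the actual code in HDRI_Environment.js may differ:

```javascript
// Sketch of a brightest-pixel search over the loaded HDR image data.
// Assumes data is a flat RGBA float array of linear radiance values.
function findBrightestPixel(hdrTexture) {
  const { data, width, height } = hdrTexture.image;
  let bestLum = -1, bestX = 0, bestY = 0;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      // standard luminance weighting of the linear RGB values
      const lum = 0.2126 * data[i] + 0.7152 * data[i + 1] + 0.0722 * data[i + 2];
      if (lum > bestLum) {
        bestLum = lum;
        bestX = x;
        bestY = y;
      }
    }
  }
  return { x: bestX, y: bestY }; // feed these into the (u, v) -> direction math above
}
```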
Still TODO is finding bright spots for scenes (especially indoors) with arbitrary artificial lighting and multiple light sources. I have an idea of how I'm going to tackle this, but it might take some more time. As mentioned previously, I really have to dive in and 'swim' around in the numbers and algos and be able to see the overall problem before I can start typing code to be placed in the renderer here on the repo. So please be patient - everything is looking positive so far, baby steps. I just need to take the robustness to the next level now!
@erichlof Here is a description of infinite area lights, I hope this is helpful :) http://www.pbr-book.org/3ed-2018/Light_Transport_I_Surface_Reflection/Sampling_Light_Sources.html#InfiniteAreaLights
https://github.com/hoverinc/ray-tracing-renderer/blob/master/src/renderer/glsl/chunks/envMap.glsl#L1
Hello! When I use HDRI_Environment.html (three.js version r110) and change the HDRI path to kiara_5_noon_2k.hdr, a lot of white noise appears in the scene when rendering as usual. This problem did not occur in the previous version.