Closed: @chriswmackey closed this issue 6 years ago
Hi @chriswmackey, to quote the glossary from *Rendering with Radiance*:
"diffraction: The deviation from linear propagation that occurs when light passes a small object or opening. This phenomenon is significant only when the object or opening is on the order of the wavelength of light, between 380 and 780 nanometers for human vision. For this reason, diffraction effects are ignored in most rendering algorithms, since most modeled geometry is on a much larger scale."
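A quick back-of-envelope sketch (my own illustration, not from the book) shows why the scale comparison in that definition matters. Using the single-slit relation sin θ = λ/a for the first diffraction minimum, the diffracted spread is vanishingly small for architectural-scale openings and only becomes significant when the slit approaches the wavelength of light:

```python
import math

WAVELENGTH = 550e-9  # mid-visible green light, in metres

def first_minimum_angle(slit_width):
    """Angle of the first single-slit diffraction minimum: sin(theta) = lambda / a.
    Returns pi/2 when the slit is narrower than the wavelength (no minimum exists)."""
    ratio = WAVELENGTH / slit_width
    return math.asin(ratio) if ratio <= 1 else math.pi / 2

for a in (1e-2, 1e-3, 1e-6):  # 1 cm, 1 mm, 1 micron
    theta = first_minimum_angle(a)
    spread_at_10m = 10 * math.tan(theta)
    print(f"slit {a:8.0e} m -> first minimum at {math.degrees(theta):8.4f} deg, "
          f"spread over 10 m: {spread_at_10m:.4f} m")
```

For a 1 cm opening the light spreads by well under a millimetre over 10 m, which is why ray tracers can safely ignore diffraction for building-scale geometry.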
In addition to the above, there is also an issue with Radiance being a three channel (RGB) system. The algorithms inside Radiance appear to be focused primarily on handling photopic calculations. The graininess you see in your simulation appears more to be an artifact of the simulation settings.
I have tested such simulations with forward-raytracing/photon mapping and have gotten better results.
@sariths ,
Thank you for the explanation; your reasoning about the graininess of the rendering makes sense. I don't think I ever realized how small the slits of the double-slit experiment are. I must have also misinterpreted the applicability of light diffraction to everyday situations.
Is it safe to say, then, that the dappling of light and the fuzziness of shadow edges that happen around tree branches or other faraway objects are not the result of light diffraction? Are they instead the result of another phenomenon, such as light bouncing between branches or leaves and diffusing slightly?
Finally, is it safe to also say that Radiance can model almost any lighting phenomenon except underwater renderings AND light diffraction?
Thank you, again!
Hi @chriswmackey,
The main cause of fuzziness in shadows is the presence of multiple sources of light (the primary source and its secondary derivatives). This effect can be demonstrated in Radiance (this has always been an afterthought for me, but I am going to get it to work in a test render and update this thread soon).
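As a rough illustration of why even a single "point-like" source produces soft edges (my own back-of-envelope sketch, not a Radiance computation): the solar disc subtends roughly 0.533°, so the penumbra cast by an edge grows with the edge's distance from the receiving surface:

```python
import math

SUN_ANGULAR_DIAMETER = math.radians(0.533)  # apparent size of the solar disc

def penumbra_width(occluder_distance):
    """Approximate penumbra width (m) cast by an edge at the given distance (m)
    from the receiving surface, considering only the finite size of the sun."""
    return occluder_distance * math.tan(SUN_ANGULAR_DIAMETER)

for d in (0.5, 2.0, 10.0):  # metres between the occluding edge and the surface
    print(f"edge {d:5.1f} m above surface -> penumbra ~{penumbra_width(d) * 100:.1f} cm")
```

A branch 10 m overhead therefore casts a penumbra on the order of 9 cm wide from the sun's size alone, before any sky diffusion or inter-reflection is considered.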
I'd say that most things we are interested in measuring from a photopic perspective can be simulated with Radiance. The reason most lighting designers are interested in measurements at all is to meet regulations, and those regulations are primarily concerned with brightness-related quantities. Brightness correlates strongly with measured luminance (which, in turn, relates to illuminance). When it comes to performance-critical issues, both the CIE and the IES speak in terms of illuminance (for general lighting) and luminance (mostly for street and tunnel lighting).
Once we start thinking beyond illuminance and luminance, and consider quantities that can't be measured with simple hand-held devices, simulations are not of much use. Here are some of the things that are typically not within the realm of possibility with Radiance (or AGI32, Dialux, Relux, etc.).
Since I started typing this list, a lot more things, based more on what I know about Color Science and Perception than photometry, started coming to mind. I guess you get the idea that there are limits to what Radiance or lighting simulations can do. :)
Applied Lighting and Daylighting is a fairly imperfect science. Here is something to consider:
@sariths ,
Thank you for all of the information; it took me some time to soak it all in. The explanation of fuzziness makes sense, and I can now clearly visualize that much of the fuzziness we see in shadows under sunlight results from the diffusion of light as it enters the atmosphere, along with the fact that not all sunlight comes from a single point in the sky. This creates a "multiple light sources" effect, which Radiance IS modeling whenever I use a CIE sky. I'm fully willing to accept that the fuzziness in my rendering is an artifact of not having enough ambient divisions or something similar, but I would venture a guess that it is at least partly due to Radiance actually modeling this "multiple light sources" phenomenon with the sunny CIE sky that I selected.
Also, this list of features that cannot be simulated is a great resource. Because so many of the non-simulatable features have to do with color, the list is particularly important for people who use Radiance to inform material selection based on intended color. For example, my office has used Radiance in the past to select glass products that won't look "green" at times of day when the sun hits them at a certain angle. While Radiance is better suited to this task than any other rendering engine, it seems we still have a lot left to model in this realm of color.
Finally, just for the sake of clarification: the only illuminance and luminance phenomena that Radiance is not able to simulate are those specifically related to the wave-like properties of light (since Radiance is a ray-tracing engine, after all, which places certain particle-like assumptions onto light). These wave-like properties should include the diffraction of light in the double-slit experiment (and its resulting distribution of illuminance on the opposite wall of the experiment) as well as the refraction of light INTO DIFFERENT COLORS, i.e. dispersion, through a prism or a medium like water (I clarify the point about colors because I know we have refractive indices on our glass materials to account for refraction itself).
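The dispersion point can be made concrete with the Cauchy approximation for a wavelength-dependent refractive index (a minimal sketch of my own; the coefficients below are assumed values roughly matching crown glass, not anything taken from Radiance):

```python
import math

# Cauchy coefficients n(lambda) = A + B / lambda^2, roughly matching
# BK7 crown glass (assumed illustrative values; lambda in micrometres)
A, B = 1.5046, 0.00420

def refractive_index(wavelength_um):
    """Cauchy approximation of the refractive index at a given wavelength."""
    return A + B / wavelength_um ** 2

def refraction_angle(incidence_deg, wavelength_um):
    """Snell's law for a ray passing from air into the glass."""
    n = refractive_index(wavelength_um)
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

for wl in (0.38, 0.55, 0.78):  # violet, green, and red edges of the visible band
    print(f"{wl * 1000:.0f} nm: n = {refractive_index(wl):.4f}, "
          f"45 deg ray refracts to {refraction_angle(45, wl):.3f} deg")
```

Because violet light sees a higher index than red, each wavelength bends by a slightly different angle, which is exactly the continuous spectral split that a three-channel RGB renderer cannot reproduce with a single refractive index per material.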
Let me know if my understanding is correct here.
@chriswmackey I am hardly the Radiance expert to have a definitive opinion on this matter. However, based on what I have learned so far, I agree with you: with the exception of the wave-like properties of light, most other phenomena, especially from a geometric-optics perspective, can be simulated with Radiance. As you recalled, I had mentioned during the workshop that Radiance can simulate "practically anything with the exception of underwater renderings." Well, that exception isn't so valid either. Here is an underwater rendering that I did in September 2015 with the (then) new Photon Mapping extension:
And here is an actual pool with underwater lights:
As you can see, even in the hands of a novice user, Radiance does a pretty good job of replicating caustics. The caustics in my rendering would have been more realistic if I had used a more refined noise function to replicate the waves in the water and had modeled my luminaires as 3D solids instead of simple rectangles with `brightdata`.
So, technically underwater lighting should be possible. I am just not sure if any validation studies have been conducted in that regard.
@sariths, this is very cool. It is good to know that underwater renderings might already be possible and are just pending validation.
This one finally explained it all to me incredibly clearly:
You were so right, @sariths, and more so than I ever realized.
I realize that I am guilty here for posting an issue that really should have been a discussion on the forum. So I'm going to close out this issue and maybe we'll bring it back onto the forum at some point.
@sariths and @mostaphaRoudsari ,
I have to admit that I am asking this question with only a tangential practical application in mind. Part of it is that I want to know whether Radiance is accounting for the "dappling" or diffraction of light around some louvers on a project at my office, which seems to be the case from these high-quality renderings:
The other reason, though, is that this question has been bugging me intellectually over the past few weeks, after I watched several documentaries on quantum physics. Mainly, I want to know: is Radiance capable of simulating the double-slit experiment, which was the original basis for the argument that light behaves like a wave? If so, it would be great to have an explanation of how Radiance models this, because it seems like something that might not be accommodated by simple ray tracing.
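For reference, the illuminance pattern the double-slit experiment produces is a pure wave-interference result. A minimal sketch of the ideal far-field two-slit intensity (my own illustration, assuming monochromatic light and point slits, not anything a ray tracer computes) looks like this:

```python
import math

WAVELENGTH = 550e-9      # monochromatic green light, in metres
SLIT_SEPARATION = 2e-6   # slits a few wavelengths apart, in metres
SCREEN_DISTANCE = 1.0    # distance from slits to screen, in metres

def fringe_intensity(x):
    """Relative two-slit interference intensity at position x (m) on the
    screen, for ideal point slits in the far-field approximation."""
    sin_theta = x / math.hypot(x, SCREEN_DISTANCE)
    phase = math.pi * SLIT_SEPARATION * sin_theta / WAVELENGTH
    return math.cos(phase) ** 2

# Fringe spacing from the small-angle approximation: dx = lambda * L / d
fringe_spacing = WAVELENGTH * SCREEN_DISTANCE / SLIT_SEPARATION
print(f"fringe spacing ~{fringe_spacing * 1000:.1f} mm")
for x in (0.0, fringe_spacing / 2, fringe_spacing):
    print(f"x = {x * 1000:6.2f} mm -> relative intensity {fringe_intensity(x):.3f}")
```

The cos² term is the part a ray tracer has no way to produce: it comes from the phase difference between the two paths, whereas geometric rays carry radiance but no phase.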
I remember that you mentioned in our workshop, @sariths , that Radiance can simulate "practically anything with the exception of underwater renderings." Does this "practically anything" include the diffraction of light?
Thank you, as always, great gurus of Radiance, -Chris