mrdoob / three.js

JavaScript 3D Library.
https://threejs.org/
MIT License

Light Probe interpolation using Tetrahedral Tessellations #16228

Open bhouston opened 5 years ago

bhouston commented 5 years ago
Description of the problem

Now that we have a LightProbe class (https://github.com/mrdoob/three.js/pull/16191) and a spherical harmonics class (https://github.com/mrdoob/three.js/pull/16187) merged, as well as some shader support (https://github.com/mrdoob/three.js/pull/16152), we should explore adding a tetrahedral tessellation method for interpolating between light probes to the renderer, so that it can set the 4 SHs + weights per object.

The best reference is this presentation from Unity itself:

https://gdcvault.com/play/1015312/Light-Probe-Interpolation-Using-Tetrahedral

(Referenced from here: https://docs.unity3d.com/Manual/LightProbes-TechnicalInformation.html)

It seems that given a set of light probes you just do a 3D Delaunay tetrahedralization and use 3D barycentric coordinates for interpolation, and you cache which tetrahedron each object is in to speed up lookups. Pretty simple: we just need a standard Delaunay algorithm implemented, along with a searchable tetrahedral data structure.
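For reference, the barycentric step really is small: the weights of a point inside a tetrahedron fall out of ratios of scalar triple products. A minimal sketch in plain JS (illustrative only, not three.js API; vectors are plain [x, y, z] arrays):

```javascript
// Barycentric weights of point p with respect to tetrahedron (a, b, c, d).
function sub(u, v) { return [u[0] - v[0], u[1] - v[1], u[2] - v[2]]; }
function cross(u, v) {
  return [u[1] * v[2] - u[2] * v[1],
          u[2] * v[0] - u[0] * v[2],
          u[0] * v[1] - u[1] * v[0]];
}
function dot(u, v) { return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]; }

function barycentricWeights(p, a, b, c, d) {
  const ab = sub(b, a), ac = sub(c, a), ad = sub(d, a), ap = sub(p, a);
  const det = dot(ab, cross(ac, ad)); // 6x the signed tetrahedron volume
  // Cramer's rule on [ab ac ad] * [w1 w2 w3]^T = ap
  const w1 = dot(ap, cross(ac, ad)) / det;
  const w2 = dot(ab, cross(ap, ad)) / det;
  const w3 = dot(ab, cross(ac, ap)) / det;
  return [1 - w1 - w2 - w3, w1, w2, w3]; // weights for a, b, c, d
}
```

A point lies inside the tetrahedron exactly when all four weights are in [0, 1], which doubles as the containment test for the cached lookup.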

/ping @WestLangley @donmccurdy @richardmonette

donmccurdy commented 5 years ago

Overlaps a bit with my comment in https://github.com/mrdoob/three.js/pull/16223#issuecomment-482658595, but should objects opt in/out for lookups in the light probe volume? For example, static terrain should not be affected. A material-level option, e.g. material.useLightProbes=true/false would be sufficient.

bhouston commented 5 years ago

I think that first one should enable a ".dynamicGI" mode on WebGLRenderer; this builds your LightProbe tetrahedralization, etc., and keeps it maintained. Then you enable ".dynamicGI" on individual objects to have them opt into the resulting lighting effects. I prefer it on an object basis rather than a material basis; it just makes more sense that way. But I guess that may be hard given the Three.JS object model. If it is on a per-material basis then yeah, it probably is .lightProbes = null or .lightProbes = [ a structure of an SH array and a weight array ], so it would behave like .map = null or not, rather than having two separate variables controlling it.

donmccurdy commented 5 years ago

I prefer it on an object basis rather than on a material basis, it just makes more sense that way. But I guess that may be hard given the Three.JS object model.

Yes, I conceptually like this on an object basis but wasn't sure how that would work. Perhaps it won't be too difficult.

donmccurdy commented 5 years ago

It seems that given a set of light probes you just do a 3D Delaunay tetrahedralization ... Pretty simple: we just need a standard Delaunay algorithm implemented, along with a searchable tetrahedral data structure.

This sounded simple enough, but after spending some time on every 3D Delaunay JS library I can find (delaunay-triangulate, incremental-delaunay, darkskyapp/delaunay), I'm not satisfied with the results:

[screenshots of the three libraries' output]

Note the near-intersecting lines crossing far corners of each volume, which I think indicate intersecting tetrahedra in the volume. For a simple 2x2x2 cube probe volume, the algorithms are generating 9-10 tetrahedra where 5-6 would be expected.

Referring back to Light probe interpolation using tetrahedral tessellations,

  • Bowyer-Watson seems to be the algorithm of choice
  • The method of finding the convex hull one dimension higher might be elegant and universal, but definitely not practical above 2D
  • Numerically robust implementation of the incircle and orientation tests available from [JShewchuk]
  • If you need a ready solution, [TetGen] by Hang Si is very decent and has some additional, potentially useful functionality, like tetrahedral mesh refinement

I cannot find a JS implementation of Bowyer-Watson. If TetGen, a C++ implementation, is any indication, it is complex: tetgen.cxx is 35,000 lines of code. I'm mildly curious about the feasibility of compiling TetGen to WASM, but would like to take a step back before putting time into something like that. In the interest of baby steps, here are some alternatives:

(a) Support a 2D Delaunay triangulation (not tetrahedralization) on the X/Z plane. For many uses this may be sufficient.
(b) Support only fixed grid layouts, similar to Blender's Irradiance Volumes.
(c) Use a simpler but more expensive interpolation method, and require users to limit the number of probes more closely.
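For scale, option (c) can be as small as inverse-distance weighting over all probes, with no tessellation at all. A sketch, assuming hypothetical probe records of the form { position: [x, y, z], sh: Float32Array(27) } (9 RGB coefficients flattened; this layout is an assumption, not an existing three.js structure):

```javascript
// Brute-force inverse-distance-squared interpolation over all probes.
// Cost is O(number of probes) per object per update.
function interpolateSH(probes, p) {
  const out = new Float32Array(27);
  let total = 0;
  const weights = probes.map(function (probe) {
    const dx = probe.position[0] - p[0];
    const dy = probe.position[1] - p[1];
    const dz = probe.position[2] - p[2];
    const w = 1 / (dx * dx + dy * dy + dz * dz + 1e-6); // epsilon avoids 1/0
    total += w;
    return w;
  });
  probes.forEach(function (probe, i) {
    const w = weights[i] / total; // normalize so weights sum to 1
    for (let k = 0; k < 27; k++) out[k] += w * probe.sh[k];
  });
  return out;
}
```

Unlike the tetrahedral scheme, every probe influences every point, so occlusion-style falloff is lost; that is the trade-off being proposed.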

Presumably this code will be in examples/js[m]/, rather than the core library, so while I'd like to have a universal solution, partial solutions may be a good starting point.

donmccurdy commented 5 years ago

Here is the relevant code:

https://github.com/mrdoob/three.js/compare/dev...donmccurdy:feat-lightprobevolume

In particular, LightProbeVolume.build.

donmccurdy commented 5 years ago

A Comparison of Five Implementations of 3D Delaunay Tessellation, by Yuanxin Liu and Jack Snoeyink, 2005. (predates Tetgen by a year)

donmccurdy commented 5 years ago

Ok, I misinterpreted the visual debugging output I included. After configuring it to render individual tetrahedra instead of just the wireframe I can see what's going on:

[screenshot of the rendered tetrahedra]

(only a few cells shown for clarity)

The interior tetrahedra are just fine. The coplanar points at the boundaries of the volume are creating very thin tetrahedra, which aren't helpful but can be worked around or removed.
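One way to work around those thin boundary tetrahedra is a signed-volume test; slivers from coplanar points have near-zero volume. A plain-JS sketch (the epsilon threshold is arbitrary):

```javascript
// Signed volume of tetrahedron (a, b, c, d): one sixth of the scalar
// triple product of the edge vectors from a.
function tetrahedronVolume(a, b, c, d) {
  const ab = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
  const ac = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
  const ad = [d[0] - a[0], d[1] - a[1], d[2] - a[2]];
  const cx = ac[1] * ad[2] - ac[2] * ad[1];
  const cy = ac[2] * ad[0] - ac[0] * ad[2];
  const cz = ac[0] * ad[1] - ac[1] * ad[0];
  return (ab[0] * cx + ab[1] * cy + ab[2] * cz) / 6;
}

// Flag degenerate "sliver" cells so they can be dropped or merged.
function isSliver(a, b, c, d, epsilon) {
  return Math.abs(tetrahedronVolume(a, b, c, d)) < (epsilon || 1e-8);
}
```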

This approach is looking feasible.

bhouston commented 5 years ago

Yes, the tetrahedra on the outside are, well, weird, but who cares. This is amazing. Are you going to pre-interpolate the SH via the CPU so that you can just pass in a single SH to each mesh for lighting? Holy crap, Three.JS is going GI. :)

donmccurdy commented 5 years ago

Are you going to pre-interpolate the SH via the CPU so that you can just pass in a single SH to each mesh for lighting?

I think so. We could pass all 4 probes from the current cell and interpolate in the vertex shader, but (1) there are other ways to handle larger meshes, and (2) I'm not confident that approach would be correct or continuous near cell boundaries.
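The CPU-side pre-interpolation amounts to a weighted sum, since SH coefficients blend linearly. A sketch assuming a flat 27-float layout (9 RGB coefficients), which is an illustrative layout rather than the three.js one:

```javascript
// Combine the probes of the current cell into one set of SH coefficients
// for the mesh. `shs` is an array of Float32Array(27), `weights` the
// matching barycentric weights (summing to 1).
function blendSH(shs, weights, out) {
  out = out || new Float32Array(27);
  out.fill(0);
  for (let i = 0; i < shs.length; i++) {
    for (let k = 0; k < 27; k++) out[k] += weights[i] * shs[i][k];
  }
  return out; // uploaded as a single per-mesh SH uniform
}
```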

donmccurdy commented 5 years ago

Several issues remain (in particular, a lot of flickering...) but this is starting to look functional:

[animated screen capture of the demo]

The API could use some consideration. I'm assuming that this code will live in examples/js[m] rather than src/, and that users' applications can decide when to update a particular mesh:

import { LightProbeVolume } from 'three/examples/jsm/probes/LightProbeVolume.js';

var volume = new LightProbeVolume()
  .addProbe( probe1 )
  .addProbe( probe2 )
  .addProbe( probe3 )
  .build();

scene.add( volume );

function render () {

  volume.update( mesh );

  renderer.render( scene, camera );

}

This has some weird facets:

bhouston commented 5 years ago

Both LightProbe and LightProbeVolume extend Object3D. I'm not sure LightProbes should be (directly) added to the scene graph at all. But if not, and they remain in src/, what is their role there?

I think that LightProbe only exists in the scene graph during editing. At run-time it should generally not be there. It is a luxury convenience for designing only. Sort of like Geometry, it is a luxury you use when you do not care to be fast or scalable.

How does the interpolated SH value get passed to the renderer? mesh.diffuseProbe? mesh.diffuseSH? mesh.material.diffuseSH?

I think that diffuseProbe should exist on the mesh at least in the long term, but for now I think it isn't too important. Thus this is great.

Do probe intensities affect interpolation? If a probe's intensity is 0, does that mean it has no weight during interpolation? Or that it has its normal weight, but its coefficients are considered 0? I'm not sure per-probe intensity and color make sense to me here.

I am unsure why LightProbe has an intensity that is separate from its SH values. I do not understand it, and I think it is unnecessary functionality. If anything, I would make intensity derived from the average intensity of the SH; if you set it, it derives the current intensity and scales the SH appropriately. Then there is no intensity value outside of the SH values themselves. Otherwise it is an unnecessary degree of freedom that just complicates things.
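The derived-intensity idea above could be sketched like this (hypothetical helpers; assumes a flat 27-float SH whose first three floats are the band-0 RGB term, and Rec. 709 luma weights as one possible definition of "average intensity"):

```javascript
// Intensity read off the SH itself: luminance of the band-0 (constant) term.
function getIntensity(sh /* Float32Array(27), first 3 floats = band-0 RGB */) {
  return 0.2126 * sh[0] + 0.7152 * sh[1] + 0.0722 * sh[2];
}

// The setter rescales every coefficient, so no second variable is stored.
function setIntensity(sh, target) {
  const current = getIntensity(sh);
  if (current === 0) return; // nothing to scale
  const s = target / current;
  for (let k = 0; k < 27; k++) sh[k] *= s;
}
```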

donmccurdy commented 5 years ago

I think that LightProbe only exists in the scene graph during editing. At run-time it should generally not be there.

I agree with the spirit of this, but functionally it implies...

... is that too much / overloaded? Does editing necessarily require that the probe be in the scene graph? To preview the effect of a change the LightProbeVolume needs to be rebuilt anyway, so maybe having a LightProbeVolume and LightProbeVolumeHelper might be sufficient for editing purposes.

I think that diffuseProbe should exist on the mesh at least in the long term...

Ok, I'm fine with that. It feels a little unnatural that the result of interpolating among probes should be another probe (I guess I expect the output to be an SH?) but I don't feel strongly about this.

Do probe intensities affect interpolation?

I am unsure why LightProbe has an intensity that is separate from its SH values. I do not understand it and I think it is unnecessary functionality.

@WestLangley @mrdoob would you be OK with removing intensity and perhaps color from the LightProbe class? I think we've agreed that probes do not need to extend Light, and these properties complicate things a bit.

mrdoob commented 5 years ago

@WestLangley @mrdoob would you be OK with removing intensity and perhaps color from the LightProbe class?

We already removed color. I'd be okay with removing intensity too.

richardmonette commented 5 years ago

I'd be okay with removing intensity too.

👍

I think we've agreed that probes do not need to extend Light

👍

LightProbe could be added to a mesh, e.g. mesh.diffuseSH

👍 Agreed, I had envisioned this being set along the lines of https://github.com/mrdoob/three.js/pull/16270/files#diff-c0e88b98497597a015ecf238e91ac3a0R1442

To be clear, per mesh/object we would store a set of pre-interpolated SH coefficients, at least imho 😄

LightProbe could be added to LightProbeVolume (has position and SH, nothing else)

👍

users' applications can decide when to update a particular mesh

Would it be possible for three.js to implement this logic internally?

My sense was that the update logic is the same for most (all?) applications: if an object/mesh moves, it needs an update; if the light probes move, then all the objects/meshes may need an update; and maybe there is an update queue, so that each frame one object gets updated, to not take too much time per frame. If this is the same in most cases, I would suggest it would be valuable if this feature came 'out of the box', so to speak.
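That policy (re-resolve on movement, amortized by a per-frame budget) might be sketched like so; createProbeScheduler is a hypothetical helper wrapped around the volume.update( mesh ) call shown earlier in the thread, and positions are plain [x, y, z] arrays for brevity:

```javascript
// A mesh is re-resolved against the volume only when it moves, and at most
// `budget` meshes are updated per frame.
function createProbeScheduler(volume, budget) {
  const queue = [];
  const lastKey = new Map(); // mesh -> position it was last resolved at

  return {
    // Call per frame per mesh; enqueues only if the mesh moved.
    track(mesh) {
      const key = mesh.position.join(',');
      if (lastKey.get(mesh) !== key) {
        lastKey.set(mesh, key);
        queue.push(mesh);
      }
    },
    // Call once per frame; spends at most `budget` updates.
    flush() {
      for (let i = 0; i < budget && queue.length > 0; i++) {
        volume.update(queue.shift());
      }
    }
  };
}
```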

bhouston commented 5 years ago

Would it be possible for three.js to implement this logic internally?

Yes it would be, but if this is in /examples/js, we should probably just do it manually for now until it moves into core.

donmccurdy commented 5 years ago

I suspect it's going to take a while to have really good workflows for designing and precomputing these probe volumes. Until we get there, having to manually call volume.update( mesh ) is a comparatively minor glitch in the developer experience. And to be honest, I'm not sure I know how to do this automatically and efficiently yet; you really want at least some clear distinction between static and dynamic objects, which doesn't exist today. For now it's nice to have the flexibility of working outside the renderer.

I had envisioned this being set along the lines of https://github.com/mrdoob/three.js/pull/16270/files#diff-c0e88b98497597a015ecf238e91ac3a0R1442

Yeah this looks like the right approach. 👍

WestLangley commented 5 years ago

I'm not sure LightProbes should be (directly) added to the scene graph at all.

I wanted a representation for ambient light that is more flexible than the single-color AmbientLight. LightProbe serves that purpose.

In the current implementation, LightProbe can be added to the scene graph as a source of indirect light. It extends Light, but that can be changed when all this settles down.

We currently have the following classes in the library:

LightProbe
AmbientLightProbe   // produces the same illumination as AmbientLight
HemisphereLightProbe    // produces the same illumination as HemisphereLight

And for convenience, LightProbeHelper.

Usage:

lightProbe = new THREE.LightProbe( sh, intensity );

lightProbe = new THREE.AmbientLightProbe( 'lightskyblue', intensity );

lightProbe = new THREE.HemisphereLightProbe( 'lightskyblue', 'saddlebrown', intensity );

lightProbe = THREE.LightProbeGenerator.fromCubeTexture( cubeTexture );

In addition, LightProbe can also act as a "probe", representing the illuminance at a particular location.

How does the interpolated SH value get passed to the renderer?

Add a light probe to the scene and update it on an as-needed-basis.

If there are many objects in your scene, then for now, you would have to use Mesh.onBeforeRender() to update the LightProbe.

Do probe intensities affect interpolation?

If you are using a LightProbe as a "probe", and not a source of light, then you can bake the intensity into the SH values if you want.

I am unsure why LightProbe has an intensity that is separate from its SH values. I do not understand it and I think it is unnecessary functionality.

The short answer is we are building a physical model, and units are important -- especially when renderer.physicallyCorrectLights = true;.

Consider this simple shader code:

illuminance = AmbientLightColor * light.intensity;

On the left is a physical quantity that has units of lux. On the right is a Color, which is unit-less. That means intensity in this case has units of lux. All lights have units. SH and Color are unit-less.

If a LightProbe is a source of light, it must have an intensity property. Intensity is where the units come from.
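The units argument can be made concrete with band 0 alone; this is an illustrative sketch rather than renderer code, with the SH coefficient treated as unit-less and intensity carrying the lux:

```javascript
// Y00 is the constant SH basis function, 1 / (2 * sqrt(pi)).
const Y00 = 0.5 * Math.sqrt(1 / Math.PI);

// shL00: unit-less band-0 coefficient; intensity: lux.
// The product is an illuminance value in lux.
function ambientIlluminance(shL00, intensity) {
  return shL00 * Y00 * intensity;
}
```

A constant "white" function projected onto band 0 has coefficient 2 * sqrt(pi), so the evaluation returns exactly the intensity, mirroring the `AmbientLightColor * light.intensity` line above.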

LightProbe could be added to a mesh, e.g. mesh.diffuseProbe

We do not have consensus on the work-flow yet. For now, you would have to use mesh.onBeforeRender() to update the scene LightProbe parameters.

It feels a little unnatural that the result of interpolating among probes should be another probe

The interpolation work-flow can be implemented any way you want.

@WestLangley would you be OK with removing intensity and perhaps color from the LightProbe class?

No. intensity can not be removed if the LightProbe can be used as a source of light. However, color can be removed. That is a design decision that we can address later.

Finally, DiffuseProbe is not correct terminology, IMO. I would call it IlluminanceProbe. But that is not really necessary as LightProbe seems sufficient. The other kind of probe would be ReflectionProbe.

donmccurdy commented 5 years ago

I wanted a representation for ambient light that is more flexible than the single-color AmbientLight. LightProbe serves that purpose. ... In the current implementation, LightProbe can be added to the scene graph as a source of indirect light.

We shouldn't be thinking of LightProbes as sources of light, in my opinion. Rather, they are samples – omni-directional information about light passing through a point of empty space, which happens to be encoded as SH coordinates. Proximity to a LightProbe does not necessarily mean that the probe has any effect at all on a mesh; it only takes one intervening probe to entirely eliminate another probe's influence. Or at least, that should be the case in the end.

The case could be made that AmbientLight and HemisphereLight were never actually "sources" of light in the same sense that our punctual lights were. They're very simple models of global indirect illumination. That they're named as lights is probably making this all more complicated. 😓

Could you explain why the LightProbe API should be designed to resemble AmbientLight/HemisphereLight, and not the other way around?

We do not have consensus on the work-flow yet. For now, you would have to use mesh.onBeforeRender() to update the scene LightProbe parameters.

Ok. That's fine for now, although at some later point it presents issues: given a scene with 100 objects, the onBeforeRender function must be invoked 100 times per frame regardless of how many objects are currently moving.

Finally, DiffuseProbe is not correct terminology, IMO. I would call it IlluminanceProbe. But that is not really necessary as LightProbe seems sufficient. The other kind of probe would be ReflectionProbe.

I'm not attached to the term DiffuseProbe, any of these are fine. 👍 LightProbe and ReflectionProbe sound easily distinguishable to me.

donmccurdy commented 5 years ago

I'm warming up to the idea of intensity values on light probes, with a qualification – what if they were applied to the SH values at edit time? For example, if I create a layout containing 10 probes and set different intensities on each, the baking process (whether that's fromCubeTexture or something else) should use that intensity as a multiplier when writing the SH values. For runtime evaluation, the intensity can then be ignored.

Blender 2.8 gives reflection probes and irradiance probes an intensity property that is used exactly this way – modifying intensity after baking has no effect.

For example:

var probe = new LightProbe( /* intensity = */ 1.0 );

LightProbeGenerator.applyCubeTexture( probe, cubeTexture );
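The edit-time folding could look like this sketch; bakeProbe and rawCoefficients are hypothetical names standing in for whatever the real baking step (fromCubeTexture or something else) produces:

```javascript
// Apply probe.intensity to the baked coefficients once, at edit time.
// Afterwards the runtime never reads intensity again, matching the
// Blender 2.8 behavior described above.
function bakeProbe(probe, rawCoefficients) {
  probe.sh = rawCoefficients.map(c => c * probe.intensity);
  probe.intensity = 1; // consumed at bake time; changing it later is a no-op
  return probe;
}
```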

bhouston commented 5 years ago

I mostly agree with everything WestLangley said except these points....

Do probe intensities affect interpolation?

If you are using a LightProbe as a "probe", and not a source of light, then your can bake the intensity into the SH values if you want.

I would argue that LightProbes are never a source of light, but are always measures of light passing through, whether defined at the vertices of the tetrahedral mesh or interpolated to a specific point in space. Just like UVs are UVs whether defined on vertices or interpolated to a specific pixel of a fragment shader -- they are always UVs. Even when using AmbientLightProbe and HemisphereLightProbe, these are not sources of light; rather, they are just statements of light passing through.

On the left is a physical quantity that has units of lux. On the right, is a Color, which is unit-less. That means intensity in this case has units of lux. All lights have units. SH and Color are unit-less.

I think we should have SH in lux when used as a light probe representation. So SH is not required to be lux but when it is used in this situation it is in lux. Just like a Vector3 is unitless but when used in a specific context can have units. SH when used as a light probe is nothing more than a spherical representation of intensity for each of the RGB channels in my mind.

I really want to have less variables in a diffuse light probe if we can represent everything with just an SH in a single natural representation.

Remember that we generally have color and intensity separate in Three.JS for HDR color values (emissive, lights), mostly because it is a UI problem to specify color correctly in an HDR fashion. HDR color pickers are not really a thing, because we have LDR UIs. I think that we can still represent color and intensity separately in the UI for ambient light probes, so it is usable on LDR displays, but I would derive them from the SH when needed, and when they are set, calculate the SH based on them. Thus they are derived values rather than distinct values.
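The derived color/intensity split for an LDR UI could look like this sketch (hypothetical helper; assumes a flat 27-float SH layout whose first three floats are the band-0 RGB term):

```javascript
// Split the band-0 RGB term into an LDR color (max channel 1) plus a
// scalar intensity, suitable for an LDR color picker. The SH remains the
// single source of truth; these values are derived, not stored.
function deriveColorIntensity(sh) {
  const r = sh[0], g = sh[1], b = sh[2];
  const intensity = Math.max(r, g, b, 1e-6); // floor avoids divide-by-zero
  return { color: [r / intensity, g / intensity, b / intensity], intensity };
}
```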

If a LightProbe is a source of light, it must have an intensity property. Intensity is where the units come from.

I think that it is never a source of light. When there is an ambient light probe it is again just a measure of light passing through -- a constant value and usually a manually set value, but it is still light passing through. I think that we should not think of them as light sources.

richardmonette commented 5 years ago

We shouldn't be thinking of LightProbes as sources of light, in my opinion. Rather, they are samples – omni-directional information about light passing through a point of empty space, which happens to be encoded as SH coordinates.

👍

Proximity to a LightProbe does not necessarily mean that the probe has any affect at all on a mesh; it only takes one intervening probe to entirely eliminate another probe's influence. Or at least, that should be the case in the end.

👍

The case could be made that AmbientLight and HemisphereLight were never actually "sources" of light in the same sense that our punctual lights were. They're very simple models of global indirect illumination. That they're named as lights is probably making this all more complicated. 😓

💯💯💯👍

(An interesting project, diversion as it would be, would be to fix the way AmbientLights are handled by interpolating tetrahedrally, instead of the current incorrect additive model. Doing this would be, in essence, order-0 SH with localized interpolation. This is, imho, an important realization: LightProbes are not additive like "normal" lights; they need a very different interpolation scheme, which results in one interpolated set of SH per object.)

https://github.com/mrdoob/three.js/issues/16228#issuecomment-486528684

I think that it is never a source of light. When there is an ambient light probe it is again just a measure of light passing through -- a constant value and usually a manually set value, but it is still light passing through. I think that we should not think of them as light sources.

👍

https://github.com/mrdoob/three.js/issues/16228#issuecomment-486631932

bhouston commented 5 years ago

@richardmonette wrote:

(An interesting project, diversion as it would be, would be to fix the way AmbientLights are handled by interpolating tetrahedrally, instead of the current incorrect additive model. Doing this would be, in essence, order-0 SH with localized interpolation. This is, imho, an important realization: LightProbes are not additive like "normal" lights; they need a very different interpolation scheme, which results in one interpolated set of SH per object.)

Exactly. But I would actually still represent them as 9-coefficient SHs, because it is just easier, and then we can generalize it into the LightProbeVolume instead of special-casing ambient lights as 1-coefficient SHs that fall outside of the general model.

WestLangley commented 5 years ago

Here is a scene illuminated by a single AmbientLight only.

[screenshot: scene illuminated by a single AmbientLight]

Last week, I added the capability for a scene to be illuminated only by a light probe -- with varying intensity no less.

[screenshot: scene illuminated by a light probe]

And it can be used without specifying a light volume.

Would you like me to remove this feature?

WestLangley commented 5 years ago

@bhouston This is code you wrote ( or maybe we wrote it together ):

// accumulation
reflectedLight.indirectDiffuse = getAmbientLightIrradiance( ambientLightColor );

A light probe can have two purposes, (1) to measure irradiance and (2) to model irradiance.

In the shader, it models indirect light. That is, light from indirect sources. So-called "indirect diffuse light" is indirect light that is reflected diffusely.

In our physical model, indirect light is additive to total scene light, and in that sense, an ambient light or light probe is a "source" of light. And in a physical model, it must have units: lux in this case.

I see nothing incorrect about referring to indirect light sources as "sources".

WestLangley commented 5 years ago

Here is something you may find more appealing...

If you want, we could just enhance the capabilities of AmbientLight so it encompasses HemisphereLight and SH parameterizations. I am not sure what the API would look like, though.

Then, we would only use the term "light probe" when modeling irradiance volumes. We wouldn't add light probes to the scene, directly, and they would not have to extend Light or Object3D.

You may still get into trouble if you remove intensity from LightProbe, though. It will depend on the to-be-determined workflow.

bhouston commented 5 years ago

In our physical model, indirect light is additive to total scene light, and in that sense, an ambient light or light probe is a "source" of light. And in a physical model, it must have units: lux in this case.

I see your point.

If you want, we could just enhance the capabilities of AmbientLight so it encompasses HemisphereLight and SH parameterizations. I am not sure what the API would look like, though.

Maybe we can add SH parameterizations to the existing AmbientLight and HemisphereLight, and then the lighting code could sum up those SHs along with the SH on the mesh.lightProbe value and pass that into the shader as a single SH representing all indirect diffuse, e.g. an indirectDiffuseSH parameter?

The net effect is the same API for Ambient + Hemisphere lights, but simplified shader code and a bit of change in the lighting code.

Just brainstorming. I should get back to work. Discussions with smart people are a bit addictive.

donmccurdy commented 5 years ago

Last week, I added the capability for a scene to be illuminated only by a light probe -- with varying intensity no less. ... And it can be used without specifying a light volume.

I do support being able to use a light probe or SH without specifying a light volume, and am not suggesting we remove that. But I would prefer to find another API for it, something like https://github.com/mrdoob/three.js/issues/16228#issuecomment-486736376 is promising. I'm not sure what that should look like either, but @bhouston's suggestion above sounds interesting too.

In our physical model, indirect light is additive to total scene light, and in that sense, an ambient light or light probe is a "source" of light. And in a physical model, it must have units: lux in this case.

Yes, but aren't these units describing the evaluated illuminance for a specific direction? So the probe can produce a physically-based value, but when assigning probe.intensity we are not assigning a physical value. We are assigning a scaling factor that will produce different physical values later. Does this definition seem correct?

"In physically-correct mode, the product of the SH function and the intensity is an irradiance value measured in lux."

You may still get into trouble if you remove intensity from LightProbe, though. It will depend on the to-be-determined workflow.

Ok, I'm happy to leave intensity alone for now. 😇

WestLangley commented 5 years ago

I wanted a proof-of-concept, to make sure the spherical harmonics interpolated as we would expect, so I started with @donmccurdy's example, and modified it a bit.

First, the mesh is illuminated with only a single AmbientLightProbe, which is a generalization of AmbientLight.

// ambient light ( the only light source in this demo )
lightProbe = new THREE.AmbientLightProbe();
scene.add( lightProbe );

Second, the ambient light is updated each frame based on the mesh's position relative to the Probe Volume.

Third, the interpolation method is just 4 or 5 lines of code. It works very well. I did not implement tetrahedral interpolation.

[screenshot of the demo]

Temporary live link: https://raw.githack.com/westlangley/three.js/dev-light_probe_volume/examples/webgl_lightprobe_volume.html

WestLangley commented 5 years ago

I plan to file a PR that adds a GILight class which is SH3-based. (I am open to changing the class/method names.)

New is GILight -- a scene light similar to AmbientLight; it is a source of light.

Also new is GIProbe -- a light probe. A probe is not a source of light, but can be used in a Light Probe Volume to store a representation of irradiance at a given location.

Probes can be baked -- from a Texture or from scene lights.

To summarize:

These classes remain unchanged, and still work -- at least for now:

var light = new THREE.AmbientLight( color, intensity );

var light = new THREE.HemisphereLight( skyColor, groundColor, intensity );

GILight is new, and has a spherical harmonics representation:

var giLight = new THREE.GILight();

GILight has these methods (so far)

giLight.setFromColor( skyColor, intensity ); // replicates an AmbientLight

giLight.setFromHemisphereColors( skyColor, groundColor, intensity, /* up */ ); // replicates a HemisphereLight
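For intuition, here is a hedged sketch of what a setFromColor like this might do internally (not the PR's actual code; assumes a flat 27-float coefficient array whose first three floats are the band-0 RGB term). Projecting a constant color over the sphere onto band 0 multiplies it by 2 * sqrt(pi), and evaluating band 0 with Y00 = 1 / (2 * sqrt(pi)) returns color times intensity exactly, which is why this replicates an AmbientLight:

```javascript
// Project a constant (direction-independent) color onto SH band 0.
function setFromColor(shCoefficients, r, g, b, intensity) {
  const s = 2 * Math.sqrt(Math.PI) * intensity;
  shCoefficients[0] = r * s; // band-0 RGB term
  shCoefficients[1] = g * s;
  shCoefficients[2] = b * s;
  // higher bands stay zero: the light has no directional variation
  return shCoefficients;
}
```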

This new SphericalHarmonics method is relegated to examples/js/math/ for now:

THREE.SHUtils.setFromCubeTexture( sh, cubeTexture );

GIProbe is a light probe. It shares code with GILight:

lightProbe = new THREE.GIProbe( sh, intensity );

lightProbe.position.set( 10, 0, 0 );

And for visualization, there is LightProbeHelper (or GIProbeHelper):

var helper = new THREE.LightProbeHelper( lightProbe, 1 );

Suggestions are welcome for alternate names for GILight. (SHLight? ugh.)

Also for GIProbe. (LightProbe, is one obvious option.)

donmccurdy commented 5 years ago

Based on the two goals we discussed earlier...

  1. an SH-based object in the scene
    • conceptually a light source, not a probe
    • helpful for static model viewers
  2. probes for storing precomputed GI
    • helpful for dynamic objects moving in a scene
    • allows various interpolation methods, some flexibility here

... the GILight and GIProbe names sound quite good to me, I would not be opposed to using those. Other ideas for GILight: GlobalLight, GISource, SkyLight, SphericalLight. Of these, SkyLight is the only term for which I can find any existing precedent: "Prior to Unreal Engine 4.18, Static Sky Lights used to be represented ... with a 3rd Order Spherical Harmonic."

mrdoob commented 5 years ago

GlobalLight is the one I feel more attracted to. "Sky" makes me think of outdoors, but this light would also work for indoor settings.

WestLangley commented 5 years ago

FWIW, a few comments for clarification...

  1. I recently added SH3-based LightProbe. It serves as a source of ambient (indirect) light; that is, as a model for global illumination. It can also be used in a Light Probe Volume to store measured directional illuminance. It can serve two purposes.

  2. I thought LightProbe was reasonable nomenclature, because we had discussed replacing the envMap nomenclature with ReflectionProbe at some point, so we would eventually have scene light probes and scene reflection probes (both local and distant reflection probes). Probes would serve as sources of light.

  3. However, there was apparent resistance to using the concept of a "probe" as a light source. (I do not find it a problem, however, and I am not sure how we will handle reflection probes now if they, too, cannot serve as a light source. That is a problem.)

  4. So, as an alternative, I proposed GILight and GIProbe. They are really the same thing, except adding a GILight to the scene will increase overall illumination, while adding a GIProbe to the scene will not (because it is not a light). Also, GIProbe is intended to be positional, GILight is not.

  5. Instead of GILight, I also considered reusing AmbientLight, but ensuring backwards-compatibility caused problems for me, so I abandoned that approach.

WestLangley commented 5 years ago

GI in GILight stands for "global illumination". Removing the "illumination" reference and calling it a GlobalLight seems unclear to me.

WestLangley commented 5 years ago

Thinking long-term, I still like using the term "probe "-- even for scene lighting (i.e., sources of light).

LightProbe (non-positional)

ReflectionProbe (non-positional, treated as infinitely far away, like an environment map)

LocalLightProbe (would be used in Probe Volumes, but maybe scene lighting, too)

LocalReflectionProbe (has a position and scale; supports parallax)
bhouston commented 5 years ago

I still favor the approach I suggested in https://github.com/mrdoob/three.js/issues/16228#issuecomment-486747322: do not really change the existing light definitions for Ambient and Hemisphere, but just convert them to SH during the lighting consolidation and push them as part of diffuseIndirectSH (which also includes the LightProbeVolume results) when setting the shader.

I think that a ReflectionProbe, if there is only one, acts as global, infinitely far away. Likewise, a DiffuseProbe, if there is only one, acts as global, infinitely far away.

AmbientLight is additive to whatever is in LightProbeVolume. HemisphereLight is additive to whatever is in LightProbeVolume.

Thus I advocate not changing the API for AmbientLight and HemisphereLight and just adding LightProbeVolume and DiffuseProbe. And we sum up all three into a single diffuseIndirectSH variable for the shader.

That said, I am okay if you go the route you are going; from my point of view it just seems complex.
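The consolidation proposed above can be sketched in plain JavaScript. This is not three.js API: the flat 27-float layout (9 SH coefficients × RGB) and all function names are illustrative, and the three inputs are assumed to already be SH-projected.

```javascript
// Sketch: sum several SH sources into one indirect-diffuse SH set.
// SH coefficients are linear, so consolidation is a plain sum.

function createSH() {
  return new Float32Array( 27 ); // 9 coefficients × RGB, all zero
}

function addScaledSH( target, source, scale = 1 ) {
  for ( let i = 0; i < 27; i ++ ) target[ i ] += source[ i ] * scale;
  return target;
}

// AmbientLight and HemisphereLight would be converted to SH elsewhere,
// then simply added to the interpolated LightProbeVolume result.
function consolidate( ambientSH, hemisphereSH, probeVolumeSH ) {
  const diffuseIndirectSH = createSH();
  addScaledSH( diffuseIndirectSH, ambientSH );
  addScaledSH( diffuseIndirectSH, hemisphereSH );
  addScaledSH( diffuseIndirectSH, probeVolumeSH );
  return diffuseIndirectSH;
}
```

Because the sum is linear, adding a second AmbientLight or HemisphereLight just contributes another term; nothing special is required.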

mrdoob commented 5 years ago

@bhouston

I don't think we're removing AmbientLight or HemisphereLight; we're trying to name the new type of indirect light source @WestLangley showed here: https://github.com/mrdoob/three.js/issues/16228#issuecomment-48670988

Maybe I'm missing something though. In order to achieve @WestLangley's result using LightProbeVolume, the user will have to create a LightProbeVolume and then add a single LightProbe to it, right? And... because there is only one LightProbe, it won't matter if the model is not in the center of the volume, correct?

bhouston commented 5 years ago

Sorry, I was wrong about that.

Yes, one way it would work is to set it into a LightProbeVolume; since there is only one probe, it would act as global. That requires the least amount of code. Maybe it is too awkward from a user standpoint?

donmccurdy commented 5 years ago

I think the idea is that GILight (or GlobalLight) could be used without any LightProbe or LightProbeVolume involved. The user has some SH coefficients (from a cubemap, or cmgen, or derived from a light arrangement) and just wants to use them globally.

The LightProbe and LightProbeVolume constructs would be optional, and only intended for users who are going to need more than one probe to light the scene.

I am not sure how we will handle reflection probes now if they, too, can not serve as a light source...

Taking the GILight / GIProbe relationship for diffuse irradiance, I do not think there is any need for a GILight equivalent for reflections. A ReflectionProbe could be added to a volume, but would affect the scene by some means that does not involve adding an Object3D source into the scene hierarchy.

GILight provides a temporary way for a LightProbeVolume to apply interpolated SH coefficients using each dynamic mesh's onBeforeRender callback, but in the long term it should be adding something directly to the mesh, like mesh.indirectDiffuse.copy( sh ), and would not rely on GILight at all. Similarly, a ReflectionProbe should not require the existence of a ReflectionLight.
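The interim mechanism described above can be sketched with plain objects rather than three.js classes. None of these names (lightProbeVolume, weightsAt, mesh.indirectDiffuse) exist in three.js; they only illustrate the flow.

```javascript
// Hypothetical light probe volume: the four probes of one tetrahedron,
// each stored as a flat 27-float array (9 SH coefficients × RGB).
const lightProbeVolume = {
  probes: Array.from( { length: 4 }, ( _, p ) => new Float32Array( 27 ).fill( p ) ),

  // A real implementation would find the tetrahedron containing
  // `position` and return its barycentric coordinates; fixed weights
  // stand in for that lookup here.
  weightsAt( position ) {
    return [ 0.25, 0.25, 0.25, 0.25 ];
  },

  // Barycentric interpolation: a weighted sum of the four probes' SH.
  interpolate( position ) {
    const weights = this.weightsAt( position );
    const out = new Float32Array( 27 );
    for ( let p = 0; p < 4; p ++ ) {
      for ( let i = 0; i < 27; i ++ ) {
        out[ i ] += this.probes[ p ][ i ] * weights[ p ];
      }
    }
    return out;
  }
};

// Interim hook: push the interpolated SH to each dynamic mesh before it
// renders. Long term this would be set directly on the mesh instead.
const mesh = { position: [ 0, 0, 0 ], indirectDiffuse: null };
mesh.onBeforeRender = function () {
  mesh.indirectDiffuse = lightProbeVolume.interpolate( mesh.position );
};
```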

mrdoob commented 5 years ago

Okay. So, considering this approach:

Do not really change the existing light definitions for Ambient and Hemisphere but just convert them to SH during the lighting consolidation and push them as part of diffuseIndirectSH (which also includes the LightProbeVolume results) when setting the shader.

We could then have a new type of light GlobalIlluminationLight (too many l's and i's together...) that could also be included in the light consolidation.

Another approach @WestLangley and I considered was to make AmbientLight more powerful. By default AmbientLight is a constant color, but we could do something like AmbientLight.fromCubeTexture( texture ). We wouldn't need a new type of light that way.
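Internally, a fromCubeTexture-style helper boils down to projecting radiance onto the nine SH basis functions. A simplified sketch, with assumptions: the cube texture is reduced to a radiance( dir ) callback, Monte Carlo sampling replaces the per-texel loop with solid-angle weights that real code would use, and projectToSH is a made-up name.

```javascript
// Project a radiance function over the sphere onto 9 real SH basis
// functions (bands 0-2), Monte Carlo style. radiance( [x, y, z] )
// returns an [r, g, b] triple for the given unit direction.
function projectToSH( radiance, samples = 4096 ) {
  const sh = new Float32Array( 27 ); // 9 coefficients × RGB

  for ( let s = 0; s < samples; s ++ ) {
    // uniform random direction on the unit sphere
    const z = 1 - 2 * Math.random();
    const phi = 2 * Math.PI * Math.random();
    const r = Math.sqrt( Math.max( 0, 1 - z * z ) );
    const x = r * Math.cos( phi ), y = r * Math.sin( phi );

    // real spherical harmonic basis, bands 0-2
    const basis = [
      0.282095,
      0.488603 * y, 0.488603 * z, 0.488603 * x,
      1.092548 * x * y, 1.092548 * y * z,
      0.315392 * ( 3 * z * z - 1 ),
      1.092548 * x * z, 0.546274 * ( x * x - y * y )
    ];

    const color = radiance( [ x, y, z ] );
    for ( let i = 0; i < 9; i ++ ) {
      for ( let c = 0; c < 3; c ++ ) {
        sh[ i * 3 + c ] += basis[ i ] * color[ c ];
      }
    }
  }

  // Monte Carlo estimate of the integral: scale by 4π / sample count.
  const norm = ( 4 * Math.PI ) / samples;
  for ( let i = 0; i < 27; i ++ ) sh[ i ] *= norm;
  return sh;
}
```

For a constant-color environment only the band-0 coefficient survives, which is exactly the flat AmbientLight case.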

@bhouston how does that sound?

donmccurdy commented 5 years ago

Another approach @WestLangley and I considered was to make AmbientLight more powerful.

That makes sense, too. 👍

richardmonette commented 5 years ago

What happens when there are two or more AmbientLights in the scene?

donmccurdy commented 5 years ago

This would be inadvisable, but I guess the SH coefficients would be summed? I think that's consistent with what multiple AmbientLight and HemisphereLight instances would do today.

WestLangley commented 5 years ago

Many thanks to everyone for your feedback.

There are now no new light types. The three classes relevant to this discussion are:

AmbientLight

LightProbe

LightProbeHelper

AmbientLight now has a spherical harmonics representation and is backwards-compatible:

var light = new THREE.AmbientLight( color, intensity ); // flat color

AmbientLight has these new methods (so far):

ambientLight.setFromColor( skyColor ); // if you need to change the color

ambientLight.setFromHemisphereColors( skyColor, groundColor /* , up */ ); // replicates a HemisphereLight

This new SphericalHarmonics method is relegated to examples/js/math/ for now:

THREE.SHUtils.setFromCubeTexture( ambientLight.sh, cubeTexture );

LightProbe now shares code with AmbientLight:

var lightProbe = new THREE.LightProbe( sh, intensity );

lightProbe.position.set( 10, 0, 0 );

And for visualization, there is LightProbeHelper:

var helper = new THREE.LightProbeHelper( lightProbe, 1 );

How does that sound?

mrdoob commented 5 years ago

I think it sounds good to me.

Just one thing:

ambientLight.setFromHemisphereColors( skyColor, groundColor /* , up */ ); // replicates a HemisphereLight

Wouldn't it be better to just make HemisphereLight use sh too? Or I guess the problem is that we wouldn't know if the user is moving the light around and we'd need to update the coefficients?

WestLangley commented 5 years ago

I guess the problem is that we wouldn't know if the user is moving the light around and we'd need to update the coefficients?

Yes, the coefficients would now store all the information, instead of the two colors and position.

AmbientLight can now replicate HemisphereLight, so we could remove HemisphereLight from the library.

Maybe we can figure out how to keep the HemisphereLight API as a wrapper around AmbientLight.

BTW, HemisphereLight should use the .up property, not the .position property, for the "tilt" effect.

FishOrBear commented 4 years ago

ref: https://www.comp.nus.edu.sg/~tants/gdel3d.html