While testing edge cases for the glTF KHR_lights_punctual extension (based on the same Frostbite paper we cite in our code), I found behavior in the three.js light system that surprised me.
In all tests, the renderer is in physically-correct mode:
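(The setup snippet is roughly the following; a minimal sketch assuming the `physicallyCorrectLights` flag on `WebGLRenderer`:)

```javascript
// Minimal sketch of the test setup (assumed API, not the exact test code):
const renderer = new THREE.WebGLRenderer();
renderer.physicallyCorrectLights = true; // light intensity interpreted in physical units
```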
In three.js, it appears that when `light.distance = 0`, no distance-based attenuation is applied to the light. I'd assumed that inverse-square attenuation was used in physical mode when `.distance = 0`, and that when `.distance > 0` we either (a) switch to linear attenuation (as stated by the docs) or (b) fall off steeply to 0 at the distance cutoff (undocumented, but this seems to be the aim of the code).
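The observed behavior can be sketched in plain JS (my own reconstruction from the rendered output, not the actual shader code; the windowed inverse-square term is the Frostbite formulation):

```javascript
// Sketch of the observed attenuation (hypothetical reconstruction):
// cutoffDistance = 0  -> no distance-based attenuation at all,
// cutoffDistance > 0  -> inverse-square, scaled by a smooth window
//                        that reaches zero at the cutoff (Frostbite-style).
function distanceAttenuation(lightDistance, cutoffDistance) {
  if (cutoffDistance === 0) {
    return 1.0; // observed: light does not attenuate
  }
  // smooth window: (1 - (d / cutoff)^4)^2, clamped to [0, 1]
  const window = Math.pow(
    Math.max(0, 1 - Math.pow(lightDistance / cutoffDistance, 4)),
    2
  );
  // inverse-square, with the denominator clamped near the light's position
  return window / Math.max(lightDistance * lightDistance, 0.01);
}
```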
Here are two demos. In the second, I've disabled the angular factor in the irradiance calculation, and it's obvious that no distance-based attenuation remains:
The grid is 10x10, and all lights are point lights.
| case | screenshot |
| --- | --- |
| `distance=0` | *(screenshot)* |
| `distance=25` | *(screenshot)* |
| target | *(screenshot)* |
If this is intentional, we should prominently state in the docs that physically-correct lighting requires a non-zero distance cutoff.
Current docs:
> **distance** — If non-zero, light will attenuate linearly from maximum intensity at the light's position down to zero at this distance from the light.
Suggested wording (for current behavior):
> **distance** — If non-zero, light will attenuate linearly from maximum intensity at the light's position down to zero at this distance from the light. If zero, light does not attenuate. For physically correct lighting, distance should be non-zero and large enough that the cutoff is as nearly unnoticeable as possible — light will attenuate according to inverse-square law up to this cutoff, and then drop steeply to zero.
But I'm not sure this is the right behavior, especially when `0` is the default distance value. In physically-correct mode it seems more intuitive for `0` to mean "I don't need a cutoff, just use inverse-square law" rather than "no attenuation".
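Concretely, this is the behavior I'd expect (a hypothetical sketch, not a proposed patch):

```javascript
// Hypothetical attenuation where distance = 0 means "no cutoff":
function proposedAttenuation(lightDistance, cutoffDistance) {
  // clamp the denominator to avoid a singularity at the light's position
  const invSquare = 1 / Math.max(lightDistance * lightDistance, 0.01);
  if (cutoffDistance === 0) {
    return invSquare; // no cutoff: plain inverse-square law
  }
  // Frostbite-style smooth window that reaches zero at the cutoff
  const window = Math.pow(
    Math.max(0, 1 - Math.pow(lightDistance / cutoffDistance, 4)),
    2
  );
  return window * invSquare;
}
```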
For reference, I've created these test scenes: https://github.com/KhronosGroup/glTF/files/2371644/LightTest.zip. The Blender output is not a good reference, but in this thread there are screenshots of Unreal, Unity, and BabylonJS that all match reasonably well.
/cc @bhouston