playcanvas / engine

JavaScript game engine built on WebGL, WebGPU, WebXR and glTF
https://playcanvas.com

Lighting change when using batching/instancing #2832

Open leonidaspir opened 3 years ago

leonidaspir commented 3 years ago

It seems there is a change in lighting with a number of models we are using in our game when batching is enabled. I wasn't able to isolate why these models produce this change. The same happens both when using CPU batching in the editor and when using hardware instancing via a scripted solution.

Without batching:

[screenshot]

With batching:

[screenshot]

Sample project URL: https://playcanvas.com/editor/scene/1084394

  1. Toggle on/off batching by setting and unsetting the batch group on the Platform and boardMain entities.

Tested on Windows/Android Chrome.

mvaligursky commented 3 years ago

To me the without batching screenshot looks strange .. do you use some manually bent normals or something? Either way, non-uniform scaling is causing it .. when I use uniform scaling, all is good. It seems this is something for @raytranuk to investigate.

mvaligursky commented 3 years ago

I assumed it was related to this PR I've just done, but it's not. https://github.com/playcanvas/engine/pull/2836

leonidaspir commented 3 years ago

> To me the without batching screenshot looks strange .. do you use some manually bent normals or something? Either way, non-uniform scaling is causing it .. when I use uniform scaling, all is good.

Yes, the modeler added a slight curve on that face, hence the extreme shading when scaled on Y; that's on purpose.

Does batching recalculate normals? Even if it does, any idea why the same happens when using hardware instancing? I wouldn't expect any change on the rendered mesh with HW instancing.

mvaligursky commented 3 years ago

Batching only transforms normals to world space (static batching anyway; dynamic does not). Hardware instancing does not touch the mesh at all. All this points at the shader .. @raytranuk could have some ideas. I'm not sure why normals would change direction .. perhaps the normal was not normalized in the mesh, and the shader normalizes it, and that is what happens?
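If you want to rule out the non-normalized-normals theory, a quick check is to scan the mesh's normals and flag any that aren't unit length. A minimal sketch (how you obtain the flat normals array depends on how the mesh was authored/exported, so that part is assumed):

```javascript
// Minimal sketch: flag normals that are not unit length.
// Assumes `normals` is a flat Float32Array [nx, ny, nz, nx, ny, nz, ...]
// obtained from the source asset or the mesh data (not shown here).
function findNonUnitNormals(normals, tolerance) {
    tolerance = tolerance || 0.01;
    var suspect = [];
    for (var i = 0; i < normals.length; i += 3) {
        var len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]);
        if (Math.abs(len - 1) > tolerance) {
            suspect.push({ index: i / 3, length: len });
        }
    }
    return suspect;
}

// Example: console.log(findNonUnitNormals(myNormals));
```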

leonidaspir commented 3 years ago

Release v1.39.0, which includes the fix from #2836, seems to have resolved this issue when using static batching. But when using dynamic batching or hardware instancing, the lighting issue remains.

To clarify, the normals shown in the "before" screenshot are the correct ones; it's an artistic decision.

I've updated the sample above to enable dynamic batching. I've also added a script that enables hardware instancing, which reproduces the issue as well (roughly the sketch below).
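For reference, the script does the standard hardware-instancing setup: build one world matrix per instance, pack them into a vertex buffer, and hand it to the mesh instance. A rough sketch (instance counts and transforms are placeholders, and the exact VertexBuffer constructor arguments can differ between engine versions):

```javascript
// Sketch of the instancing script. The non-uniform Y scale is what triggers
// the lighting difference. Names like `instanceBoards` are placeholders.
var InstanceBoards = pc.createScript('instanceBoards');

InstanceBoards.prototype.initialize = function () {
    var device = this.app.graphicsDevice;
    var instanceCount = 4;

    // One 4x4 world matrix per instance, packed into a flat Float32Array.
    var matrices = new Float32Array(instanceCount * 16);
    var matrix = new pc.Mat4();
    var idx = 0;
    for (var i = 0; i < instanceCount; i++) {
        // Note the non-uniform scale on Y.
        matrix.setTRS(new pc.Vec3(i * 2, 0, 0), pc.Quat.IDENTITY, new pc.Vec3(1, 0.25, 1));
        for (var m = 0; m < 16; m++) {
            matrices[idx++] = matrix.data[m];
        }
    }

    // Upload the matrices as the per-instance vertex stream.
    var vb = new pc.VertexBuffer(device, pc.VertexFormat.defaultInstancingFormat,
        instanceCount, pc.BUFFER_STATIC, matrices);

    // Enable instancing on the mesh instance.
    var meshInstance = this.entity.model.meshInstances[0];
    meshInstance.setInstancing(vb);
};
```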

Without batching / Without HW instancing / With Static Batching (lighting is correct):

[screenshot]

With dynamic batching / With HW instancing (lighting changes):

[screenshot]

I am curious why it would change when HW instancing is enabled, since the engine doesn't create a new mesh at all in that case.

mvaligursky commented 3 years ago

My theory: when you have a non-uniform scale on a mesh, then to transform an object-space normal into world space, the transpose of the inverse world matrix needs to be used.
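For context, the standard (not engine-specific) argument for why the inverse transpose is needed: a normal n must stay perpendicular to every surface tangent t, and tangents transform by the world matrix M, so the normal matrix G must satisfy:

```latex
% Find G such that transformed normals stay perpendicular to transformed tangents:
% (G n) . (M t) = 0 whenever n . t = 0.
(Gn)^{\top}(Mt) = n^{\top} G^{\top} M \, t = 0
\;\Longrightarrow\; G^{\top} M = I
\;\Longrightarrow\; G = (M^{-1})^{\top}
```

For a uniform scale, (M^-1)^T is just a scalar multiple of M's rotation/scale part, so using M plus a normalize still gives the correct direction; with a non-uniform scale the directions genuinely differ.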

  1. rendering without batching or instancing - I believe the shader uses the transpose of the inverse world matrix, so all is ok
  2. static batching - in the PR you linked I transform normals to world space using the transpose inverse matrix, so this works too
  3. dynamic batching and instancing - these have a custom path for normals, where we only have the object-to-world matrix, but not its transpose inverse. So when that is used to transform normals from object to world space, we get the issue (see the sketch after this list).
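To see the difference concretely, here is a small standalone check using the engine's math classes; it mimics what case 3 effectively does versus what the regular path does (just an illustration, not engine internals):

```javascript
// With a non-uniform scale, transforming a normal by the plain world matrix
// gives a different direction than using the transpose of its inverse,
// even after normalizing - which matches the lighting change seen above.
var world = new pc.Mat4();
world.setTRS(pc.Vec3.ZERO, pc.Quat.IDENTITY, new pc.Vec3(1, 0.2, 1)); // squash on Y

var normal = new pc.Vec3(0, 0.7071, 0.7071); // a slanted face normal

// What the dynamic batching / instancing path effectively does: use the world matrix.
var approx = world.transformVector(normal, new pc.Vec3()).normalize();

// What the regular path does: use transpose(inverse(world)).
var normalMatrix = world.clone().invert().transpose();
var correct = normalMatrix.transformVector(normal, new pc.Vec3()).normalize();

console.log(approx.toString());  // tips towards the XZ plane
console.log(correct.toString()); // tips towards +Y - visibly different lighting
```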

Possible solutions to bring the transpose inverse world matrix to shaders for case 3:

  1. for dynamic batching - we currently store a single "bone"; we could store another bone with it.
  2. for instancing, the instance vertex stream currently has a 3x4 matrix; we'd need another set of attributes for an additional 3x3 matrix.
  3. we could do the transpose invert in the shader in those cases.

The performance impact here would perhaps not be worth it. In your specific case, you could reimport the mesh without the non-uniform scale. Or even use static batching and regenerate it each frame (sketch below). Or don't use any instancing / batching.
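If you go the static-batching-each-frame route, something along these lines should work. A sketch only - `boardsGroupId` is a placeholder for whatever batch group id is set up in the editor, and re-batching every frame has its own CPU cost:

```javascript
// Sketch: mark the static batch group dirty every frame so the BatchManager
// regenerates it (with correctly transformed normals) before the next render.
var RebatchEachFrame = pc.createScript('rebatchEachFrame');

// Placeholder attribute - set this to the batch group id used in the editor.
RebatchEachFrame.attributes.add('boardsGroupId', { type: 'number', default: 1 });

RebatchEachFrame.prototype.update = function (dt) {
    this.app.batcher.markGroupDirty(this.boardsGroupId);
};
```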