leonidaspir opened this issue 3 years ago
To me, the without-batching screenshot looks strange... do you use some manually bent normals or something? Either way, non-uniform scaling is causing it: when I use uniform scaling, all is good. It seems this is something for @raytranuk to investigate.
I assumed it was related to this PR I've just done, but it's not. https://github.com/playcanvas/engine/pull/2836
> To me, the without-batching screenshot looks strange... do you use some manually bent normals or something? Either way, non-uniform scaling is causing it: when I use uniform scaling, all is good.
Yes, the modeler added a slight curve on that face, hence the extreme shading when scaled on Y; that's on purpose.
Does batching recalculate normals? Even if it does, any idea why the same happens when using hardware instancing? I wouldn't expect any change on the rendered mesh with HW instancing.
Batching only transforms normals to world space (static batching anyway; dynamic does not). Hardware instancing does not touch the mesh at all. All this points to the shader. @raytranuk could have some ideas; I'm not sure why the normals would change direction. Perhaps the normal was not normalized in the mesh, the shader normalizes it, and that is what happens?
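A quick sketch (plain JS, not engine code) of why shader renormalization alone can't be the fix: under uniform scale, renormalizing a normal transformed by the world matrix recovers the correct direction, but under non-uniform scale the direction itself is skewed, so the normal is no longer perpendicular to the surface.

```javascript
// Illustrative sketch: transform a "bent" normal by a 3x3 world matrix
// and renormalize, comparing uniform vs non-uniform scale.
function mulMat3Vec3(m, v) {
  // row-major 3x3 matrix times column vector
  return [
    m[0] * v[0] + m[1] * v[1] + m[2] * v[2],
    m[3] * v[0] + m[4] * v[1] + m[5] * v[2],
    m[6] * v[0] + m[7] * v[1] + m[8] * v[2]
  ];
}
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map(x => x / len);
}

const n = normalize([1, 1, 0]);   // a tilted ("bent") object-space normal
const tangent = [1, -1, 0];       // lies in the surface: dot(n, tangent) === 0

const uniform = [2, 0, 0, 0, 2, 0, 0, 0, 2];    // uniform scale 2
const nonUniform = [2, 0, 0, 0, 1, 0, 0, 0, 1]; // non-uniform scale (2, 1, 1)

const nUniform = normalize(mulMat3Vec3(uniform, n)); // identical to n
const nSkewed = normalize(mulMat3Vec3(nonUniform, n));
const tWorld = mulMat3Vec3(nonUniform, tangent); // tangents transform with M

// The skewed normal is no longer perpendicular to the transformed surface,
// so lighting changes; no amount of renormalizing fixes the direction.
const dot = nSkewed[0] * tWorld[0] + nSkewed[1] * tWorld[1] + nSkewed[2] * tWorld[2];
console.log(dot); // non-zero => wrong shading
```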
Release v1.39.0, which includes a fix for #2836, seems to have resolved this issue when using static batching. But with dynamic batching or hardware instancing, the lighting issue remains.
To clarify, the normals shown in the "before" photo are the correct ones; it's an artistic decision.
I've updated the sample above to enable dynamic batching. I've also added a script that enables hardware instancing which reproduces the issue as well.
Without batching / Without HW instancing / With Static Batching (lighting is correct):
With dynamic batching / With HW instancing (lighting changes):
I am curious why it would change when HW instancing is enabled, since the engine doesn't create a new mesh at all.
My theory: when a mesh has a non-uniform scale, transforming an object-space normal into world space requires the transpose of the inverse of the world matrix.
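The theory above can be checked with a few lines of plain JS (names are illustrative, not engine API): transforming the normal with the transpose of the inverse world matrix keeps it perpendicular to the transformed surface, while the plain world matrix does not. For a pure scale matrix, transpose(inverse(M)) is simply diag(1/sx, 1/sy, 1/sz).

```javascript
// Illustrative sketch: inverse-transpose normal transform under
// a non-uniform scale of (2, 1, 1).
function mulMat3Vec3(m, v) {
  return [
    m[0] * v[0] + m[1] * v[1] + m[2] * v[2],
    m[3] * v[0] + m[4] * v[1] + m[5] * v[2],
    m[6] * v[0] + m[7] * v[1] + m[8] * v[2]
  ];
}
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map(x => x / len);
}

const world = [2, 0, 0, 0, 1, 0, 0, 0, 1];          // non-uniform scale
const invTranspose = [0.5, 0, 0, 0, 1, 0, 0, 0, 1]; // transpose(inverse(world))

const n = normalize([1, 1, 0]); // object-space normal
const tangent = [1, -1, 0];     // in-surface direction, perpendicular to n

const nWrong = normalize(mulMat3Vec3(world, n));        // skewed
const nRight = normalize(mulMat3Vec3(invTranspose, n)); // correct
const tWorld = mulMat3Vec3(world, tangent);             // tangents use M

const dotWrong = nWrong[0] * tWorld[0] + nWrong[1] * tWorld[1] + nWrong[2] * tWorld[2];
const dotRight = nRight[0] * tWorld[0] + nRight[1] * tWorld[1] + nRight[2] * tWorld[2];
console.log(dotWrong, dotRight); // dotRight is ~0: still perpendicular
```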
Possible solutions to bring transpose inverse world matrix to shaders for case 3:
The performance impact here would perhaps not be worth it. In your specific case, you could reimport the mesh without the non-uniform scale, or even use static batching each frame, or avoid instancing/batching entirely.
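The "reimport without non-uniform scale" workaround could also be approximated offline by baking the scale into the vertex data, so the entity keeps an identity (or uniform) scale. A minimal sketch, assuming flat position/normal arrays with stride 3; `bakeScale` is a hypothetical helper, not a PlayCanvas API:

```javascript
// Hypothetical helper: bake a non-uniform scale into mesh data.
// Positions are multiplied by the scale; normals use the inverse
// scale (the inverse-transpose for a pure scale matrix) and are
// then renormalized.
function bakeScale(positions, normals, sx, sy, sz) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i]     *= sx;
    positions[i + 1] *= sy;
    positions[i + 2] *= sz;

    let nx = normals[i] / sx;
    let ny = normals[i + 1] / sy;
    let nz = normals[i + 2] / sz;
    const len = Math.hypot(nx, ny, nz) || 1;
    normals[i]     = nx / len;
    normals[i + 1] = ny / len;
    normals[i + 2] = nz / len;
  }
}

// Example: one vertex with a tilted normal, baking scale (2, 1, 1).
const positions = [1, 1, 0];
const normals = [Math.SQRT1_2, Math.SQRT1_2, 0];
bakeScale(positions, normals, 2, 1, 1);
console.log(positions, normals);
```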
It seems there is a change in lighting with a number of models we use in our game when batching is enabled. I wasn't able to isolate why these models produce this change. The same happens both with CPU batching in the editor and with a scripted hardware-instancing solution.
Without batching:
With batching:
Same project url: https://playcanvas.com/editor/scene/1084394
Tested on Windows/Android Chrome.