Hi, I'm testing out the pretrained model output with three.js lights, placing them into a scene and setting the normal map to object space, but it seems that it's reading the normal map colors at an angle of some sort.
E.g., in this image, I have a point light (exaggerated brightness) at a position of [0, -Y, -Z]. I'd expect the light to hit the image full on, but instead it renders a dark diagonal patch on the left (and it does this consistently across all images).
I saw https://github.com/EPFL-VILAB/omnidata/issues/13, but even flipping the Y and Z axes still produces a diagonal pattern in the light reflection.
Do you know what might be happening here?
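In case it helps, this is roughly how I understand the decoding that should be happening (a sketch; I'm assuming the map stores object-space normals as RGB in [0, 1], remapped to [-1, 1], and `flipYZ` is the axis flip suggested in issue #13):

```javascript
// Decode an RGB normal-map texel (channels in [0, 1]) into a unit
// normal vector in [-1, 1]. flipYZ negates the Y and Z components,
// which is the coordinate-convention fix suggested in issue #13.
function decodeNormal([r, g, b], flipYZ = false) {
  let n = [r * 2 - 1, g * 2 - 1, b * 2 - 1];
  if (flipYZ) {
    n = [n[0], -n[1], -n[2]];
  }
  // Renormalize to guard against 8-bit quantization error.
  const len = Math.hypot(n[0], n[1], n[2]);
  return n.map((c) => c / len);
}
```

With that convention, a texel of (0.5, 0.5, 1.0) decodes to a normal of [0, 0, 1], i.e. facing straight out of the image, so a light placed along that axis should hit it head on rather than producing the diagonal falloff I'm seeing.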