aframevr / aframe

Web framework for building virtual reality experiences.
https://aframe.io/
MIT License

Best practices for color space management #3509

Closed: donmccurdy closed this issue 5 years ago

donmccurdy commented 6 years ago

A-Frame 0.8.0 brings in some upstream fixes to THREE.GLTFLoader that are technically correct, but potentially inconsistent with existing scenes and examples. Pre-0.8.0, gltf-model assumed all textures were in linear color space — material.color and material.src still make this same assumption today. That's incorrect — images are usually sRGB, and the glTF spec explicitly requires color and emissive textures to use sRGB.

Following best practice:

  1. Textures are converted sRGB -> linear at the beginning of the fragment shader if the image encoding is set to sRGB.
  2. Renderer calculations are done in linear space.
  3. Renderer output is converted linear -> sRGB for the screen if renderer.gammaOutput=true.
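
For reference, here is a minimal sketch of how those three steps map onto the three.js settings of this era (the texture file name below is just a placeholder):

var renderer = new THREE.WebGLRenderer();
var texture = new THREE.TextureLoader().load('diffuse.png'); // placeholder image

// (1) Flag the color texture as sRGB; the fragment shader decodes it to linear.
texture.encoding = THREE.sRGBEncoding;

// (2) Lighting and blending then run on linear values inside the renderer.

// (3) Convert the final linear output back to sRGB for the screen.
renderer.gammaOutput = true;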

By default in A-Frame and three.js, renderer.gammaOutput is false and step (3) never happens. As a result, glTF models look darker than they should. Textures using material.src look mostly right because we don't set their encoding to sRGB, even though the images usually are sRGB; their colors are therefore treated as linear in lighting calculations, which in practice isn't very noticeable.

User danilo in Slack created this PDF, which illustrates the problem nicely:

https://drive.google.com/file/d/1BRdCU9nDzUHOfDYYXdbAV1cMW60Md6uv/view

I don't have any quick answers on what we should do here... in short, I think the three.js defaults assume some familiarity with color spaces, and ideally we would abstract this away from our users, or at least provide some best practices as a reference for users who are going to care about matching a color palette precisely.

If we were to change our defaults, here's a straw proposal:

The one thing this doesn't fix is material.color values, which are assumed to be already linear in three.js, and don't have an encoding setting. So either (1) we could provide guidance about converting an sRGB hex code to linear, or (2) we would automatically do this in the material component.
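
For option (1), a rough sketch of that conversion using three.js's THREE.Color helper (the hex value is just an example):

var material = new THREE.MeshStandardMaterial();

// The author picked #808080 in an sRGB color picker; convert it to linear
// before lighting so the math stays consistent. 0.5 per channel in sRGB
// comes out around 0.21 in linear, i.e. numerically darker.
material.color.set('#808080').convertSRGBToLinear();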

I'm also not sure what this all means for vertex colors...

My vote for a first step would be to add srcColorSpace and emissiveColorSpace properties in a PR, verify that they improve outcomes, and then enable them without changing any defaults. It would be nice to have feedback from someone more familiar with color spaces and/or rendering pipelines. 🙂

webdan76 commented 6 years ago

Thanks, Don. As the document is publicly available now, for the sake of honesty and clarity, I would like to add that the texture I used in that PDF was created by the Sketchfab user proxy_doug.

In particular, it was included in the assets I downloaded to create my submission for the Mozilla WebVR Medieval Fantasy Experience Challenge, and this is the link to his collection:

https://sketchfab.com/models/6429e3f908f74be3bd157b5cb0c01e81

donmccurdy commented 6 years ago

Maybe an even more user-friendly option would be creating an option renderer="colorSpace: sRGB" that (1) sets gammaOutput=true, (2) sets texture.encoding = THREE.sRGBEncoding for diffuse and emissive maps, (3) automatically converts material="color: <color>;" from sRGB to linear space. This is a bit more complex, however, and user-land components would still need to convert any colors they create themselves.
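
As a hypothetical sketch only (the component name srgb-color-space is made up for illustration, not an existing A-Frame API), this is roughly what such an option could do internally when attached as <a-scene srgb-color-space>:

AFRAME.registerComponent('srgb-color-space', {
  init: function () {
    var sceneEl = this.el;
    sceneEl.addEventListener('loaded', function () {
      // (1) Encode renderer output from linear back to sRGB.
      sceneEl.renderer.gammaOutput = true;
      sceneEl.object3D.traverse(function (node) {
        if (!node.isMesh || !node.material) { return; }
        // (2) Flag color and emissive textures as sRGB so shaders decode them.
        if (node.material.map) { node.material.map.encoding = THREE.sRGBEncoding; }
        if (node.material.emissiveMap) { node.material.emissiveMap.encoding = THREE.sRGBEncoding; }
        // (3) Treat authored colors as sRGB and convert them to linear.
        if (node.material.color) { node.material.color.convertSRGBToLinear(); }
        node.material.needsUpdate = true;
      });
    });
  }
});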

donmccurdy commented 6 years ago

I recommend that we do all of the above except (3) in a future release.

By doing all of these at the same time, we'll minimize broken scenes on upgrade — images and models should appear a bit less washed out, but not significantly darker or lighter. Directly given colors like material="color: #f00" will appear lighter than before, but I'm not confident about changing that default.

EDIT: Unity has some really nice references on the differences in lighting when using linear or sRGB colorspace: https://docs.unity3d.com/Manual/LinearRendering-LinearOrGammaWorkflow.html

donmccurdy commented 6 years ago

For anyone stumbling across this issue before it gets fixed, here’s a hacky workaround to mark non-glTF images as sRGB:

sceneEl.addEventListener('loaded', (e) => {
  sceneEl.object3D.traverse((o) => {
    // Flag each mesh's color texture as sRGB so the shader decodes it to linear.
    if (o.isMesh && o.material && o.material.map) {
      o.material.map.encoding = THREE.sRGBEncoding;
      o.material.needsUpdate = true;
    }
  });
});

Combined with renderer="gammaOutput: true" on the scene, that should give consistent and correct colors for all images in the scene.

donmccurdy commented 6 years ago

One possible first step here: https://github.com/aframevr/aframe/compare/master...donmccurdy:feat-colorspace

With that branch, users may write the following:

<a-scene color-space="sRGB" renderer="gammaOutput: true, physicallyCorrectLights: true">
  ...
</a-scene>

Result:

  1. All color-related textures loaded by the gltf-model component will have the correct colorspace. (already the case)
  2. All color-related textures loaded by material component will have correct colorspace.
  3. Lighting (from the upcoming glTF lights extension) will have correct attenuation.
  4. Output to screen will have correct colorspace.

But that's not a backward-compatible change for most scenes. Vertex colors and material="color: #ff0000" properties will appear significantly lighter in this mode, and that's hard to automatically fix. I'd suggest we make a correct mode available, as above, recommend it for users who run into these issues, and work toward using it in our own examples.

Things I'm still not sure about:

  1. Could gammaOutput: true be combined with the colorspace option? Maybe, but you might want to use colorspace without gammaOutput if you're using post-processing, too.
  2. Should colorspace be a property of the renderer? Also maybe, but the renderer component doesn't load until after all of the material components, so it would be kind of hacky to get the right data at the right time.

colinfizgig commented 6 years ago

I tried your workaround and it is definitely closer to correct lighting, but the scene material colors made things wash out, and adding an environment reflection makes it wash out even more. Unity seems to deal with this in a transparent way across the scene: setting the color space in the camera toggles the look based on what the scene is set to. This is (kind of) like setting ISO or white balance on a camera, in that you set the effect and see the result to decide if you like it. As it currently stands I can light around A-Frame's colorspace darkness, but the way reflections suddenly pop with gammaOutput: true makes it too washed out to be useful, and tweaking between glTF materials and colors on the material component is a major pain. I'll try your feat-colorspace branch to see what it's like.

colinfizgig commented 6 years ago

So I saw the same results with feat-colorspace. Here are screenshots comparing the current colorspace and lighting with what you get from feat-colorspace.

[screenshot: current colorspace vs. feat-colorspace comparison]

I'm pretty sure I can light around what's currently going on with the colorspace, but the gamma wash makes it hard to correct the tones. And it's worse with reflection maps. I can see where gamma could be useful for post effects like color correction with a LUT or something, but I don't know if it will get much use if it makes getting good results with materials and lighting harder.

colinfizgig commented 6 years ago

Something else to consider as I posted that image: the look of the render changes depending on the browser and screen settings. What I notice with feat-colorspace is that it seems to de-contrast the image in a global way, which makes it harder to light a scene. Most people set the brightness and contrast on their monitors to work for standard web page contrast and text. Not too bright, but bright enough to wash out gamma-corrected images.

donmccurdy commented 6 years ago

I haven't got time to write up full details just yet, but a couple of quick notes:

About the "de-contrasting", I think what you're seeing is related to the points above and the (intended) outcome of linear rendering — brightness of the final image is linear with amount of light in the scene. See Unity's details here: https://blogs.unity3d.com/2017/07/17/linear-rendering-support-with-webgl-2-0/. Unity presumably handles all of this with less direct user input than converting colors by hand, as I suggested above, and I'm not sure how to accomplish that. :/
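
A rough numeric illustration of that point, using the common gamma-2.2 approximation of the sRGB transfer curve rather than the exact piecewise function:

// Doubling the light doubles the linear value, but the sRGB-encoded value
// that reaches the screen grows much less, which reads as reduced contrast
// compared to an uncorrected (gamma) workflow.
var encodeSRGB = function (linear) { return Math.pow(linear, 1 / 2.2); };

console.log(encodeSRGB(0.2)); // one light  -> ~0.48 on screen
console.log(encodeSRGB(0.4)); // two lights -> ~0.66 on screen, not 0.96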

I would separate all of this from post effects, LUT, and so on... the goal is to have default settings that ensure textures and colors match their appearance in the tools they were created with. If users want to apply tone mapping and so on to customize after that, it's certainly fine.

donmccurdy commented 6 years ago

Also ...

The look of the render changes depending on the browser and screen settings.

And it's worse with reflection maps.

... that's definitely not intended. Could you share examples, if these aren't just taken from the examples in this repo?

colinfizgig commented 6 years ago

I like the goal of getting the lighting consistent between apps. I think it should be somewhat automatic as a default, because most people won't go down the rabbit hole to figure out the pipeline between apps. I'm not against this at all. As someone who strives for photorealism in VR, I'm for anything that helps with lighting. I've also learned that what people expect as photoreal is actually a color-managed "chemical film look" with bloom and some grain for good measure. I'm being slightly sarcastic, but there is a lot of truth in that statement. My point is that whatever the solution is, it should take the color management approach used by most graphics programs, be somewhat transparent to the user, and be the best "average" of a good look across most display methods.

So there should be a consistent look between three.js, A-Frame, and tools like Maya and Blender. In addition, it would be good to test the solution over a variety of monitor types to make sure the look is similar, with the importance of consistency weighted toward VR headsets and computer monitors, then mobile and other screens. I'd also recommend checking across browsers and tools like Photoshop, because you will see a difference between a texture in Photoshop or GIMP and what is seen on the monitor, depending on their color management systems. It's probably the same with tools like Substance Painter.

Here is an example of the differences I can see between programs and the browser, as well as color management. It's not a big difference except between Apple and PC, but the difference I saw in your gamma build of A-Frame was much more pronounced.

[image: pariscolormanagement comparison]

colinfizgig commented 6 years ago

I'll post a difference pic with my arcade cabinet to show the difference in reflections in a bit.

colinfizgig commented 6 years ago

Here are two more tests showing what happens with an envMap applied, comparing current A-Frame and your sRGB build.

[screenshots: basic scene sRGB vs. current; arcade A-Frame sRGB tests]

I see how having no envMap makes the color show up better on the arcade cabinet, but that means the metalness and roughness maps aren't really being used, since there is nothing for them to reflect. In addition, flat-shaded textures get washed out. Not sure what issue is causing the mismatch.

donmccurdy commented 5 years ago

@colinfizgig see https://github.com/aframevr/aframe/pull/3757 and https://github.com/mrdoob/three.js/issues/11337. I think we're making progress on this. Also note that in the upcoming version of three.js there are some changes to how LDR environment maps are handled, per https://github.com/mrdoob/three.js/issues/15285... it seems like an LDR envMap wasn't accounting for gamma correction before.

donmccurdy commented 5 years ago

I think we can consider this closed by https://github.com/aframevr/aframe/pull/3757, although I'm sure some further iteration will be necessary. In particular, I'd like to move more of the color management to the three.js level in the future, so that non-core components will be color-managed by default.